The wait is finally over and our offline Mobile SDK is here and ready for liftoff (woohoo)! Here are a few of the more frequent questions that we've received on it so far:
I want in!!!! How do I get access to this?
Great question! The SDK is currently available via the GitHub link above so feel free to dive right in!
Is it available on all platforms?
Not quite, but we'll get there. At the moment it's only available on iOS, and you can use it with either Swift or Objective-C.
But what about Android?!?! I wanna get my Java on!
We hear you, Android Universe! We're certainly planning on releasing an Android version in the near future, and hopefully one for Windows as well :-)
How big is it?
16 fun-filled megabytes
Can it do everything that the regular API can?
Well, we're getting there. The SDK can run predictions on our General Model as well as on any custom models (all offline, of course)! Visual Search is also on the roadmap, so stay tuned for that.
Is there any sample code that I can use?
Yep! Check out the repo for those.
What happens to images that are used to train a model?
Images used to train a model on your device never leave the device unless you explicitly send them somewhere, so your training data stays private by default.
Does it work with videos?
Not yet but we're working on it!
How much does it cost?
Right now it's completely free during the trial period, but only for a limited time! (Don't worry, we'll let you know before that changes.)
I'm getting "Undefined symbols for architecture x86_64: _OBJC_CLASS_$_Clarifai". Why is that?
Typically this means you'll simply need to install Git-LFS on your system, but to be on the safe side, do all of the following:
- Install Git-LFS
- Upgrade CocoaPods: sudo gem install cocoapods
- Update your local CocoaPods spec repository. You may need to force an update of your local, cached repo with: pod install --repo-update
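If you're installing via CocoaPods, a minimal Podfile for the setup above might look like this. This is just a sketch: the pod name, target name, and iOS version below are assumptions, so double-check them against the repo's README.

```ruby
# Podfile — minimal sketch; pod name, target, and platform version
# are assumptions, so verify them against the repo's README.
platform :ios, '9.0'   # assumed minimum iOS version
use_frameworks!        # the SDK ships as a dynamic framework

target 'YourApp' do    # replace with your app's actual target name
  pod 'Clarifai'       # hypothetical pod name — check the README
end
```

After editing the Podfile, run pod install --repo-update from your project directory and open the generated .xcworkspace (not the .xcodeproj) going forward.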
If you're still having problems, please email email@example.com and we'll take a further look!