The wait is finally over and our offline Mobile SDK is here and ready for liftoff (woohoo)! Here are a few of the more frequent questions that we've received on it so far:
I want in!!!! How do I get access to this?
Great question! Our iOS and Android SDKs are currently available via the GitHub links above, so feel free to dive right in!
Is it available on all platforms?
Not quite, but we'll get there. At the moment it's fully available on iOS, where you can use it with either Swift or Objective-C, and the Android version uses Java.
How big is it?
16 fun-filled megabytes
Can it do everything that the regular API can?
Well, we're getting there. The SDK can run predictions with our General, NSFW, and Face Detection models, as well as any custom models (all offline, of course)! Visual Search is also on the roadmap, so stay tuned for that.
Is there any sample code that I can use?
Yep! Check out the repos linked above for sample code.
What happens to images that are used to train a model?
Images used to train a model on your device never leave the device unless you explicitly make that happen, so you don't have to worry about your training data being uploaded.
Does it work with videos?
Not yet but we're working on it!
How much does it cost?
Right now it's completely free to use on a single phone, but only for a limited time! (Don't worry, we'll give you a heads-up before that changes.)
I'm getting "Undefined symbols for architecture x86_64: _OBJC_CLASS_$_Clarifai". Why is that?
Typically this means you just need to install Git-LFS on your system, but to be on the safe side you'll want to do all of the following:
- Install Git-LFS
- Upgrade CocoaPods with `sudo gem install cocoapods`
- Update your local CocoaPods repository. You may need to force a refresh of your local, cached repo with `pod install --repo-update`
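Put together, the steps above look roughly like this in a terminal. This is a sketch: the `brew` line assumes Homebrew on macOS, and you can install Git-LFS by whatever method you prefer.

```shell
# Install Git-LFS (assumes Homebrew; any install method works)
brew install git-lfs
git lfs install          # wires Git-LFS into your Git configuration

# Upgrade CocoaPods itself
sudo gem install cocoapods

# From your Xcode project directory: reinstall pods,
# forcing a refresh of the local, cached spec repo
pod install --repo-update
```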
If you're still having problems, please email email@example.com and we'll take a closer look!