Switch to a new MLKit library which runs on-device #150

Open · wants to merge 3 commits into master
7 changes: 2 additions & 5 deletions .github/workflows/main.yml
@@ -14,8 +14,5 @@ jobs:

steps:
  - uses: actions/checkout@v1
  - run: sudo chown -R cirrus:cirrus ./ /github/home/
  - name: Flutter pub get
    run: flutter packages get
  - name: Flutter analyze --suppress-analytics
    run: flutter analyze --suppress-analytics
  - run: flutter pub get
  - run: flutter analyze --suppress-analytics
33 changes: 14 additions & 19 deletions README.md
@@ -16,14 +16,10 @@ First, add `flutter_camera_ml_vision` as a dependency.
dependencies:
  flutter:
    sdk: flutter
  flutter_camera_ml_vision: ^2.2.4
  flutter_camera_ml_vision: ^3.0.1
  ...
```

## Configure Firebase
You must also configure Firebase for each platform project: Android and iOS (see the `example` folder or https://firebase.google.com/codelabs/firebase-get-to-know-flutter#3 for step by step details).


### iOS

Add two rows to the ios/Runner/Info.plist:
@@ -42,10 +38,10 @@ Or in text format add the key:
If you're using one of the on-device APIs, include the corresponding ML Kit library in your Podfile. Then run `pod update` in a terminal within the same directory as your Podfile.

```
pod 'Firebase/MLVisionBarcodeModel'
pod 'Firebase/MLVisionFaceModel'
pod 'Firebase/MLVisionLabelModel'
pod 'Firebase/MLVisionTextModel'
pod 'GoogleMLKit/BarcodeScanning', '~> 2.2.0'
pod 'GoogleMLKit/FaceDetection', '~> 2.2.0'
pod 'GoogleMLKit/ImageLabeling', '~> 2.2.0'
pod 'GoogleMLKit/TextRecognition', '~> 2.2.0'
```

### Android
@@ -65,7 +61,7 @@ android {
  dependencies {
    // ...

    api 'com.google.firebase:firebase-ml-vision-image-label-model:19.0.0'
    api 'com.google.mlkit:image-labeling:17.0.5'
  }
}
```
@@ -78,7 +74,7 @@ Optional but recommended: If you use the on-device API, configure your app to au
<application ...>
  ...
  <meta-data
      android:name="com.google.firebase.ml.vision.DEPENDENCIES"
      android:name="com.google.mlkit.vision.DEPENDENCIES"
      android:value="ocr" />
  <!-- To use multiple models: android:value="ocr,label,barcode,face" -->
</application>
@@ -90,7 +86,7 @@ Optional but recommended: If you use the on-device API, configure your app to au

```dart
CameraMlVision<List<Barcode>>(
  detector: FirebaseVision.instance.barcodeDetector().detectInImage,
  detector: GoogleMlKit.vision.barcodeScanner().processImage,
  onResult: (List<Barcode> barcodes) {
    if (!mounted || resultSent) {
      return;
@@ -101,15 +97,14 @@ CameraMlVision<List<Barcode>>(
)
```

`CameraMlVision` is a widget that shows the preview of the camera. It takes a detector as a parameter here we pass the `detectInImage` method of the `BarcodeDetector` object.
The detector parameter can take all the different FirebaseVision Detector. Here is a list :
`CameraMlVision` is a widget that shows the camera preview. It takes a detector as a parameter; here we pass the `processImage` method of the `BarcodeScanner` object.
The detector parameter accepts any of the MLKit Vision detectors. Here is a list:

```
FirebaseVision.instance.barcodeDetector().detectInImage
FirebaseVision.instance.cloudLabelDetector().detectInImage
FirebaseVision.instance.faceDetector().processImage
FirebaseVision.instance.labelDetector().detectInImage
FirebaseVision.instance.textRecognizer().processImage
GoogleMlKit.vision.barcodeScanner().processImage
GoogleMlKit.vision.imageLabeler().processImage
GoogleMlKit.vision.faceDetector().processImage
GoogleMlKit.vision.textDetector().processImage
```

Then, when something is detected, the `onResult` callback is called with the detected data as its parameter.
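
As a fuller illustration, here is a minimal sketch of a page that plugs another detector from the list above into `CameraMlVision` and reacts in `onResult`. The widget and its generic parameter follow the snippet above; the page class, the logging, and the face detector's `List<Face>` result type are assumptions based on the `google_ml_kit` package rather than this plugin's documented API.

```dart
import 'package:flutter/material.dart';
import 'package:flutter_camera_ml_vision/flutter_camera_ml_vision.dart';
import 'package:google_ml_kit/google_ml_kit.dart';

// Hypothetical example page: shows the camera preview and logs detected faces.
class FacePreviewPage extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: Text('Face detection')),
      // CameraMlVision renders the preview and feeds camera frames
      // to the detector function passed below.
      body: CameraMlVision<List<Face>>(
        detector: GoogleMlKit.vision.faceDetector().processImage,
        onResult: (List<Face> faces) {
          // Called each time the detector returns data for a frame.
          if (faces.isEmpty) {
            return;
          }
          debugPrint('Detected ${faces.length} face(s)');
        },
      ),
    );
  }
}
```

In a real app you would typically keep the detector instance in state and close it when the widget is disposed; the shape of the call stays the same for every detector in the list.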
4 changes: 1 addition & 3 deletions example/android/app/build.gradle
@@ -55,10 +55,8 @@ flutter {
}

dependencies {
    api 'com.google.firebase:firebase-ml-vision-barcode-model:16.1.2'
    implementation 'com.google.mlkit:barcode-scanning:17.0.0'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test:runner:1.1.1'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.1.1'
}

apply plugin: 'com.google.gms.google-services'
19 changes: 5 additions & 14 deletions example/android/app/src/main/AndroidManifest.xml
@@ -2,36 +2,27 @@
    package="fr.rushioconsulting.flutter_camera_ml_vision_example"
    xmlns:tools="http://schemas.android.com/tools">

    <!-- io.flutter.app.FlutterApplication is an android.app.Application that
         calls FlutterMain.startInitialization(this); in its onCreate method.
         In most cases you can leave this as-is, but you if you want to provide
         additional functionality it is fine to subclass or reimplement
         FlutterApplication and put your custom class here. -->
    <application
        android:name="io.flutter.app.FlutterApplication"
        android:label="flutter_camera_ml_vision_example"
        android:icon="@mipmap/ic_launcher">
        <activity
            android:name=".MainActivity"
            android:name="io.flutter.embedding.android.FlutterActivity"
            android:launchMode="singleTop"
            android:theme="@style/LaunchTheme"
            android:configChanges="orientation|keyboardHidden|keyboard|screenSize|locale|layoutDirection|fontScale|screenLayout|density|uiMode"
            android:hardwareAccelerated="true"
            android:windowSoftInputMode="adjustResize">
            <!-- This keeps the window background of the activity showing
                 until Flutter renders its first frame. It can be removed if
                 there is no splash screen (such as the default splash screen
                 defined in @style/LaunchTheme). -->
            <meta-data
                android:name="io.flutter.app.android.SplashScreenUntilFirstFrame"
                android:value="true" />
            <intent-filter>
                <action android:name="android.intent.action.MAIN"/>
                <category android:name="android.intent.category.LAUNCHER"/>
            </intent-filter>
        </activity>
        <meta-data
            android:name="com.google.firebase.ml.vision.DEPENDENCIES"
            android:name="flutterEmbedding"
            android:value="2" />
        <meta-data
            android:name="com.google.mlkit.vision.DEPENDENCIES"
            android:value="barcode" />
    </application>
</manifest>

This file was deleted.

5 changes: 2 additions & 3 deletions example/ios/Podfile
100644 → 100755
@@ -29,9 +29,8 @@ flutter_ios_podfile_setup

target 'Runner' do
  flutter_install_all_ios_pods File.dirname(File.realpath(__FILE__))
  # because of issue https://github.com/FirebaseExtended/flutterfire/issues/4625#issuecomment-821792539
  # need custom git version until it's done
  pod 'FirebaseMLVisionBarcodeModel', '0.21', :source => 'git@github.com:rozdonmobile/CocoaPodsSpecs.git'

  pod 'GoogleMLKit/BarcodeScanning', '~> 2.2.0'
end

post_install do |installer|