In this guide, we will explore the Document Scanner features of the Dynamsoft Capture Vision SDK.
- Supported Version: 0.71.0 or higher
- Node: 18 or higher
- Android:
  - Supported OS: Android 5.0 (API Level 21) or higher.
  - Supported ABI: armeabi-v7a, arm64-v8a, x86 and x86_64.
  - Development Environment: Android Studio 2022.2.1 or higher.
- iOS:
  - Supported OS: iOS 13+.
  - Supported ABI: arm64 and x86_64.
  - Development Environment: Xcode 13+ (Xcode 14.1+ recommended).
Run the following commands in the root directory of your React Native project to add `dynamsoft-capture-vision-react-native` into the dependencies:

```bash
# using npm
npm install dynamsoft-capture-vision-react-native

# OR using Yarn
yarn add dynamsoft-capture-vision-react-native
```

Then run the command to install all dependencies:

```bash
# using npm
npm install

# OR using Yarn
yarn install
```

For iOS, you must install the necessary native frameworks from CocoaPods by running the `pod install` command as below:

```bash
cd ios
pod install
```

The Dynamsoft Capture Vision SDK needs camera permission to use the camera device, so it can capture from the video stream.
For Android, the camera permission is already declared within the SDK, so you don't need to do anything.
For iOS, you need to add the camera permission to `ios/your-project-name/Info.plist` inside the `<dict>` element:

```xml
<key>NSCameraUsageDescription</key>
<string></string>
```

Put a short explanation of why your app needs the camera inside the `<string>` element.
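For example (the description text below is only an illustration; use wording that fits your app):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to scan documents.</string>
```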
Now that the package is added, it's time to start building the document scanner component using the SDK.
The first step in code configuration is to initialize a valid license via `LicenseManager.initLicense`.
```typescript
import {LicenseManager} from 'dynamsoft-capture-vision-react-native';

LicenseManager.initLicense("your-license-key")
  .then(() => {/*Init license successfully.*/})
  .catch(error => console.error("Init License failed.", error));
```

Note:
- The license string here grants a time-limited free trial which requires a network connection to work.
- You can request a 30-day trial license via the Request a Trial License link.
Before opening the camera to start document scanning, you need to request camera permission from the system.

```typescript
import {CameraEnhancer} from 'dynamsoft-capture-vision-react-native';

CameraEnhancer.requestCameraPermission();
```

The basic workflow of scanning a document from the video stream is as follows:

- Initialize the `CameraEnhancer` object
- Initialize the `CaptureVisionRouter` object
- Bind the `CameraEnhancer` object to the `CaptureVisionRouter` object
- Register a `CapturedResultReceiver` object to listen for scanned documents via the callback function `onProcessedDocumentResultReceived`
- Open the camera
- Start document scanning via `startCapturing`
```tsx
import React, {useEffect, useRef} from 'react';
import {
  CameraEnhancer,
  CameraView,
  CaptureVisionRouter,
  EnumPresetTemplate,
  ProcessedDocumentResult,
  imageDataToBase64,
} from 'dynamsoft-capture-vision-react-native';

export function Scanner() {
  const cameraView = useRef<CameraView>(null); // Create a reference to the CameraView component using useRef.
  const camera = CameraEnhancer.getInstance(); // Get the singleton instance of CameraEnhancer.
  const router = CaptureVisionRouter.getInstance(); // Get the singleton instance of CaptureVisionRouter.

  useEffect(() => {
    router.setInput(camera); // Bind the CaptureVisionRouter and the CameraEnhancer before router.startCapturing().
    camera.setCameraView(cameraView.current!); // Bind the CameraEnhancer and the CameraView before camera.open().

    /**
     * Add a CapturedResultReceiver object to listen for the captured result.
     * In this sample, we only listen for onProcessedDocumentResultReceived,
     * generated by the Dynamsoft Document Normalizer module.
     */
    let resultReceiver = router.addResultReceiver({
      // If capturing is started with EnumPresetTemplate.PT_DETECT_AND_NORMALIZE_DOCUMENT,
      // ProcessedDocumentResult will be received in this callback.
      onProcessedDocumentResultReceived: (result: ProcessedDocumentResult) => {
        // Handle the `result`.
        if (result.deskewedImageResultItems && result.deskewedImageResultItems.length > 0) {
          let deskewedImageBase64 = imageDataToBase64(result.deskewedImageResultItems[0].imageData);
          //...
        }
      },
    });

    /**
     * Open the camera when the component is mounted.
     * Remember to request camera permission before this.
     */
    camera.open();

    /**
     * Start capturing when the component is mounted.
     * In this sample, we start capturing with the
     * EnumPresetTemplate.PT_DETECT_AND_NORMALIZE_DOCUMENT template.
     */
    router.startCapturing(EnumPresetTemplate.PT_DETECT_AND_NORMALIZE_DOCUMENT);

    return () => {
      // Remove the receiver when the component is unmounted.
      router.removeResultReceiver(resultReceiver);
      // Stop capturing when the component is unmounted.
      router.stopCapturing();
      // Close the camera when the component is unmounted.
      camera.close();
    };
  }, [camera, router, cameraView]);

  return (
    <CameraView style={{flex: 1}} ref={cameraView}>
      {/* You can add your own views here as children of CameraView. */}
    </CameraView>
  );
}
```

If you want to detect the document boundary and adjust it manually, you can call `startCapturing` with the `EnumPresetTemplate.PT_DETECT_DOCUMENT_BOUNDARIES` template.
The `ProcessedDocumentResult` will then be received through the `onProcessedDocumentResultReceived` callback.
You can use the Editor component to learn how to draw `ProcessedDocumentResult.detectedQuadResultItems` on the original image and interactively edit the quads.
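To see the add/remove receiver pattern in isolation, here is a self-contained sketch that does not import the SDK. The `ToyRouter` class and the minimal interfaces are hypothetical stand-ins with assumed shapes; only the callback name `onProcessedDocumentResultReceived` and the result field names mirror the API used above.

```typescript
// Minimal assumed shapes mirroring the fields used in the sample above.
interface DeskewedImageResultItem { imageData: string }
interface ProcessedDocumentResult { deskewedImageResultItems?: DeskewedImageResultItem[] }
interface CapturedResultReceiver {
  onProcessedDocumentResultReceived?: (result: ProcessedDocumentResult) => void;
}

// ToyRouter is a hypothetical stand-in for CaptureVisionRouter's receiver list.
class ToyRouter {
  private receivers: CapturedResultReceiver[] = [];
  addResultReceiver(r: CapturedResultReceiver): CapturedResultReceiver {
    this.receivers.push(r);
    return r; // Keep the returned object so it can be removed later.
  }
  removeResultReceiver(r: CapturedResultReceiver): void {
    this.receivers = this.receivers.filter(x => x !== r);
  }
  emit(result: ProcessedDocumentResult): void {
    // Dispatch a result to every registered receiver.
    for (const r of this.receivers) r.onProcessedDocumentResultReceived?.(result);
  }
}

const router = new ToyRouter();
const seen: string[] = [];
const receiver = router.addResultReceiver({
  onProcessedDocumentResultReceived: (result) => {
    if (result.deskewedImageResultItems && result.deskewedImageResultItems.length > 0) {
      seen.push(result.deskewedImageResultItems[0].imageData);
    }
  },
});

router.emit({deskewedImageResultItems: [{imageData: 'base64-bytes'}]});
router.removeResultReceiver(receiver);
router.emit({deskewedImageResultItems: [{imageData: 'ignored'}]}); // No longer delivered.
// seen is now ['base64-bytes']
```

This is why the full sample keeps the object returned by `addResultReceiver`: the same reference is needed to unregister it in the `useEffect` cleanup.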
Go to your project folder, open a new terminal and run the following command:

```bash
# using npm
npm run android

# OR using Yarn
yarn android
```

For iOS:

- Open the workspace file `*.xcworkspace` (not `.xcodeproj`) from the `ios` directory in Xcode.
- Adjust Provisioning and Signing settings.

```bash
# using npm
npm run ios

# OR using Yarn
yarn ios
```

If everything is set up correctly, you should see your new app running on your device. This is one way to run your app; you can also run it directly from within Android Studio or Xcode.
Note:
If you build the Android app on Windows, you may encounter build errors due to the Windows Maximum Path Length Limitation.
We therefore recommend moving the project to a directory with a shorter path.
The full sample code is available here.
- How to enable new architecture in Android
- How to enable new architecture in iOS