The objective of my Swift iOS project is to take a user's hand-drawn gesture, match it against the 1,000+ monochrome reference images in my library, and show the user the best-matching images, ranked in order with a % probability of match.
You will write a Swift 2.1 wrapper for the OpenCV 2 ([login to view URL]) iOS framework. I specifically need the FLANN functionality so that I can match the user's gesture drawing against roughly 1,000 PNG images in my library (sample images are attached to this job posting) and report the results as a % match.
Your resulting code will help me achieve the following:
1. Add the OpenCV iOS framework to my existing Swift 2.1 iPhone project.
2. Drag and drop your code into the project.
3. Import OpenCV and your code in the bridging header or directly in my Swift files.
4. Call a Swift function, passing an array of my library's 1,000+ PNG images (simple monochrome images, each under 5 KB). The function will return an array of FLANN data corresponding to the array passed in. Processing should run at 250 images per second or faster.
5. Call a second Swift function, passing (1) the image the user has created with gestures on the screen and (2) the array of FLANN data created in the previous step. The function will return the % of feature matching for each of the library array items. Processing should run at 250 images per second or faster.
The code should be Xcode 7.1 / Swift 2.1 compliant, and I should not have to write any additional C++ or Objective-C code (I only write Swift).
NOTE: See the attached samples for the kind of monochrome symbol images in the library.
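The two functions described in steps 4 and 5 might take the following shape. This is a minimal sketch, not the deliverable: every type and function name here is an assumption, the toy histogram descriptor stands in for the real OpenCV feature extraction, the brute-force comparison stands in for a real FLANN index search, and it is written in current Swift syntax rather than Swift 2.1 for clarity.

```swift
import Foundation

// Hypothetical stand-in for a library image: a name plus raw monochrome pixels.
struct LibraryImage {
    let name: String
    let pixels: [UInt8] // row-major grayscale values, 0–255
}

// Hypothetical stand-in for the per-image "FLANN data" of step 4.
// A real wrapper would hold OpenCV descriptors / a FLANN index here.
struct FlannData {
    let name: String
    let descriptor: [Double]
}

// Toy descriptor: an 8-bin normalized intensity histogram, standing in for
// whatever OpenCV feature extraction the contractor actually implements.
func describe(_ pixels: [UInt8]) -> [Double] {
    var bins = [Double](repeating: 0, count: 8)
    for p in pixels { bins[Int(p) / 32] += 1 }
    let total = Double(max(pixels.count, 1))
    return bins.map { $0 / total }
}

// Step 4: one call that turns the whole library into FLANN data.
func buildFlannData(library: [LibraryImage]) -> [FlannData] {
    return library.map { FlannData(name: $0.name, descriptor: describe($0.pixels)) }
}

// Step 5: match the user's drawing against the precomputed FLANN data and
// return a 0–100 match percentage per library item, best match first.
func matchPercentages(drawing: [UInt8], against data: [FlannData]) -> [(name: String, percent: Double)] {
    let query = describe(drawing)
    let scored = data.map { item -> (name: String, percent: Double) in
        // L1 distance between normalized histograms lies in [0, 2].
        let dist = zip(query, item.descriptor).reduce(0.0) { $0 + abs($1.0 - $1.1) }
        return (item.name, max(0, 100 * (1 - dist / 2)))
    }
    return scored.sorted { $0.percent > $1.percent }
}
```

A production version would replace `describe` with real descriptor extraction (e.g. the SIFT/SURF-style features mentioned below) and the pairwise comparison with a FLANN nearest-neighbour search, but the two function signatures mirror the calls described in steps 4 and 5.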
I have used many feature descriptors in OpenCV.
I have implemented image retrieval using those features, including SIFT and SURF, with several matching methods.
If you are interested, I will send you a demo.
I have everything fully ready for you.
Regards.