Saturday 23 August 2014

Android Touch Gestures Capturing Interface

Introduction

In this article we will look at an Android application that captures touch gestures. This module is the first part of a generic touch-based gesture recognition library.

Background
A gesture is a prerecorded touch-screen motion sequence. Gesture recognition is an active research area in the fields of pattern recognition, image analysis and computer vision.
The application can operate in several modes. One of the options the user can select is to capture and store a candidate gesture.
The aim is to build a generic C/C++ library that can store gestures in a user-defined format.

Gesture Registration Android Interface

This process of capturing and storing information about candidate gesture classes is called gesture registration.

In the present article we will use the GestureOverlay method. A gesture overlay acts as a simple drawing board on which the user can draw gestures. The user can modify several visual properties, like the color and the width of the stroke used to draw gestures, and register various listeners to follow what the user is doing.
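These visual properties can equivalently be set from code. A minimal sketch (using the overlay id R.id.gestures from the layout below; the values are illustrative):

    GestureOverlayView overlay = (GestureOverlayView) findViewById(R.id.gestures);
    overlay.setGestureColor(Color.YELLOW);    //color of the stroke while drawing
    overlay.setGestureStrokeWidth(12.0f);     //width of the stroke in pixels
    overlay.setFadeOffset(420);               //delay in ms before a finished gesture fades out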

To capture and process gestures, the first step is to add a GestureOverlayView to the store_gesture.xml XML layout file.


<!-- store_gesture.xml: a reconstructed sketch of the layout (the original
     listing was stripped); the ids gestures, gesture_name and done match
     the activity code shown below -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">

    <EditText
        android:id="@+id/gesture_name"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:hint="Gesture name" />

    <android.gesture.GestureOverlayView
        android:id="@+id/gestures"
        android:layout_width="match_parent"
        android:layout_height="0dip"
        android:layout_weight="1.0"
        android:gestureStrokeType="single" />

    <Button
        android:id="@+id/done"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Store"
        android:onClick="addGesture" />

</LinearLayout>
Among the properties specified on the gesture overlay, android:gestureStrokeType="single" indicates uni-stroke gestures.

Now, in the main activity file, we just need to set the content view to the layout file; in the present application the gesture-capture layout file is store_gesture.xml.

Since we also need to process a gesture once it has been performed, we add a gesture listener to the overlay. The most commonly used listener is GestureOverlayView.OnGesturePerformedListener, which fires whenever a user is done drawing a gesture. In the present application we use a class GesturesProcessor that implements GestureOverlayView.OnGestureListener.

Once the gesture is drawn by the user, control flow enters the onGestureEnded method. Here we copy the gesture and can perform a host of actions, like storing or predicting it.

Below is an image of the UI interface, followed by the GesturesProcessor implementation:


private class GesturesProcessor implements GestureOverlayView.OnGestureListener {
        public void onGestureStarted(GestureOverlayView overlay, MotionEvent event) {
            mDoneButton.setEnabled(false);
            mGesture = null;
        }

        public void onGesture(GestureOverlayView overlay, MotionEvent event) {
        }

        //callback function entered when the gesture registration is completed
        public void onGestureEnded(GestureOverlayView overlay, MotionEvent event) {
            //copy the captured gesture into a member variable
            mGesture = overlay.getGesture();
          //ignore the gesture if length is below a threshold
            if (mGesture.getLength() < LENGTH_THRESHOLD) {
                overlay.clear(false);
            }
          //enable the store button
            mDoneButton.setEnabled(true);
        }

        public void onGestureCancelled(GestureOverlayView overlay, MotionEvent event) {
        }
    }


Upon clicking the store button, the program enters the "addGesture" callback function (wired to the button via android:onClick in the layout).

    public void addGesture(View v) {
        Log.e("CreateGestureActivity", "Adding Gestures");
        //extract information, like point locations, from the Android Gesture
        //object and make native library calls to store the gesture
        extractGestureInfo();
    }


We define all the JNI interface functions in the class GestureLibraryInterface. There are two JNI calls into the native C/C++ gesture library:

public class GestureLibraryInterface {
    static { Loader.load(); }
    //makes native calls to GestureLibrary to store gesture information on the local filesystem
    public native static void addGesture(ArrayList<Float> location, ArrayList<Long> time, String name);
    //makes native calls to GestureLibrary to set the gesture directory
    public native static void setDirectory(String name);
}

The first step is to set the directory where the gestures will be stored by making the "setDirectory" call. This is done when the Android activity is initialized, in the "onCreate" function.

    //directory on the external storage where gestures are stored
    private static final String DIR = Environment.getExternalStorageDirectory().getPath() + "/AndroidGesture/v1";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        //load the store activity GUI layout file
        setContentView(R.layout.store_gesture);
        //get the store button object
        mDoneButton = findViewById(R.id.done);
        //get the EditText object
        eText = (EditText) findViewById(R.id.gesture_name);
        //configure the gesture overlay listener
        GestureOverlayView overlay = (GestureOverlayView) findViewById(R.id.gestures);
        overlay.addOnGestureListener(new GesturesProcessor());
        //set the gesture storage directory
        GestureLibraryInterface.setDirectory(DIR);
    }

The extractGestureInfo method reads the gesture strokes and stores the locations in an ArrayList, which is passed to native C/C++ through the JNI interface.

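The full implementation is in the project source; below is a hedged sketch of what extractGestureInfo does. The per-point timestamp list is an assumption (GestureStroke only exposes its points as a flattened float[] of x,y pairs):

    private void extractGestureInfo() {
        ArrayList<Float> locations = new ArrayList<Float>();
        ArrayList<Long> times = new ArrayList<Long>();   //assumed: collected during capture
        //flatten every stroke of the captured gesture into x,y pairs
        for (GestureStroke stroke : mGesture.getStrokes()) {
            for (float coordinate : stroke.points) {
                locations.add(coordinate);
            }
        }
        String name = eText.getText().toString();
        //hand the raw point data to the native library over JNI
        GestureLibraryInterface.addGesture(locations, times, name);
    }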

The JNI C/C++ code associated with the Java class is defined in the files GestureLibraryInterface.cpp and GestureLibraryInterface.hpp:

//function calls the GestureRecognizer methods to add a gesture to the class path
JNIEXPORT void JNICALL Java_com_openvision_androidgesture_GestureLibraryInterface_addGesture(JNIEnv *, jobject, jobject, jobject, jstring);

//function calls the GestureRecognizer methods to set the main gesture directory path
JNIEXPORT void JNICALL Java_com_openvision_androidgesture_GestureLibraryInterface_setDirectory(JNIEnv *, jobject, jstring);

//utility functions to convert from the jobject datatype to float and long
float getFloat(JNIEnv *env, jobject value);
long getLong(JNIEnv *env, jobject value);
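The .cpp implementations live in the repository; the sketch below illustrates one plausible way addGesture can unwrap its ArrayList arguments through JNI. The GesturePoint constructor and the recognizer instance are assumptions, not the project's exact code:

JNIEXPORT void JNICALL Java_com_openvision_androidgesture_GestureLibraryInterface_addGesture
    (JNIEnv *env, jobject obj, jobject locations, jobject times, jstring name)
{
    //look up the java.util.ArrayList methods once
    jclass listClass = env->GetObjectClass(locations);
    jmethodID sizeMethod = env->GetMethodID(listClass, "size", "()I");
    jmethodID getMethod = env->GetMethodID(listClass, "get", "(I)Ljava/lang/Object;");

    //coordinates arrive flattened as x0,y0,x1,y1,...
    std::vector<GesturePoint> points;
    jint n = env->CallIntMethod(locations, sizeMethod);
    for (jint i = 0; i + 1 < n; i += 2) {
        float x = getFloat(env, env->CallObjectMethod(locations, getMethod, i));
        float y = getFloat(env, env->CallObjectMethod(locations, getMethod, i + 1));
        points.push_back(GesturePoint(x, y));    //assumed constructor
    }

    //decode the gesture name and save through the recognizer (assumed global instance)
    const char *cname = env->GetStringUTFChars(name, NULL);
    recognizer.save(string(cname), points);
    env->ReleaseStringUTFChars(name, cname);
}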


The UniStrokeGesture library consists of the following files:
  • UniStrokeGestureLibrary
  • UniStrokeGesture
  • GesturePoint
The UniStrokeGestureLibrary class encapsulates all the properties of uni-stroke gestures. It contains methods for storing, retrieving and predicting gestures, amongst others.

The "addGesture" JNI Method calls the save routine implemented in the class to Store the Gestures
The UniStrokeGestureLibrary consists of a sequence of objects of type UniStrokeGesture.

The objects of class UniStrokeGesture encapsulates all the properties of single gesture class.UniStrokeGesture Class contains facility to store multiple instances of sample gesture,as UniStroke Gesture can be represented by multiple candidated instances. 

The UniStrokeGesture is contains as sequence of objects of type GesturePoint.Each Gesture points represents a element of UniStrokeGesture and is characterized by its location in 2D grid.
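A compact sketch of these relationships (member names are inferred from the code below, not the library's exact declarations):

class GesturePoint {
public:
    Point2f position;                       //location of the point on the 2D grid
};

class UniStrokeGesture {
public:
    string name;                            //name of the gesture class
    vector<vector<GesturePoint> > samples;  //multiple candidate instances per class
};

class UniStrokeGestureRecognizer {
public:
    vector<UniStrokeGesture> gestures;      //one entry per registered gesture class
    void save(string dir, vector<GesturePoint> points);  //store a new sample (below)
    void generateBitmap(string file);       //render a stored sample for display
};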

/**
 *  function that stores the gesture to a specified directory
 */
void UniStrokeGestureRecognizer::save(string dir,vector<GesturePoint> points)
{
    char abspath1[1000],abspath2[1000];
    sprintf(abspath1,"%s/%s",_path.c_str(),dir.c_str());

    int count=0;

    //check if the directory exists, else create it
    int ret=ImgUtils::createDir((const char *)abspath1);

    //the number of existing samples determines the new file name
    count=ImgUtils::getFileCount(abspath1);

    sprintf(abspath2,"%s/%d.csv",abspath1,count);
    //writing contents to file in CSV format, one "x,y" row per gesture point
    ofstream file(abspath2,std::ofstream::out);
    for(int i=0;i<points.size();i++)
        file<<points[i].position.x<<","<<points[i].position.y<<endl;
    file.close();

    //creating a bitmap while storing the CSV file
    generateBitmap(abspath2);
}

Consider an example of a gesture stored in CSV format; each row holds the x and y coordinates of one gesture point.
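For illustration only (these are not recorded values), a stored gesture file might look like:

240.0,120.0
238.2,125.5
236.9,131.0
...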



The generateBitmap function loads the gesture points from the input CSV file and generates a bitmap image that is suitable for display.

void UniStrokeGestureRecognizer::generateBitmap(string file)
{
   string basedir=ImgUtils::getBaseDir(file);
   string name=ImgUtils::getBaseName(file);

   cv::Mat image=cv::Mat(640,480,CV_8UC3);
   image.setTo(cv::Scalar::all(0));
   cv::Point x1,x2;
   int delta=20;
   vector<GesturePoint> points;
   //loading the gesture from the CSV file
   points=loadTemplateFile(file.c_str(),"AA");

   //getting the bounding box of the gesture points
   Rect R=boundingBox(points);

   //drawing the gesture: mark the first point, then join successive points
   int i=0;
   cv::circle(image,cv::Point((int)points[i].position.x,(int)points[i].position.y),3,cv::Scalar(255,255,0),-1,CV_AA);
   for(i=1;i<points.size();i++)
   {
       x1=cv::Point((int)points[i-1].position.x,(int)points[i-1].position.y);
       x2=cv::Point((int)points[i].position.x,(int)points[i].position.y);
       cv::line(image,x1,x2,cv::Scalar(0,255,0),2,CV_AA);
   }

   //expand the bounding box by a small margin and clip it to the image
   R.x=std::max(R.x-delta,0);
   R.y=std::max(R.y-delta,0);
   R.width=R.width+2*delta;
   R.height=R.height+2*delta;
   if(R.x+R.width>image.cols)
       R.width=image.cols-R.x-1;
   if(R.y+R.height>image.rows)
       R.height=image.rows-R.y-1;

   //extract the ROI containing the gesture
   Mat roi=image(R);
   Mat dst;
   cv::resize(roi,dst,Size(640,480));
   string bmpname=basedir+"/"+name+".bmp";
   //save the bitmap
   cv::imwrite(bmpname,dst);
}

 

Display the Gesture List

The next part of the application deals with displaying the gestures created in the above section on the Android UI.

Upon starting the application, all the .bmp files in the template directory are loaded.

The loading of the gesture bitmaps is done asynchronously in the background.

In Android, a ListView is used to display the gesture bitmaps and their associated text.

The layout of each list item is defined in the file gestures_item.xml.
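The original listing is not reproduced here; a minimal sketch of its content follows. Since getView below casts the inflated view to TextView, the root element is a single TextView (the attribute values are assumptions):

<TextView xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:gravity="center_vertical"
    android:drawablePadding="12dip"
    android:padding="8dip" />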



The layout for the ListView is defined in the main layout file activity_open_vision.xml.

The displayGestures function is defined in OpenVisionGesture.java.

An object of a subclass of AsyncTask, GesturesLoadTask, is defined in the main class file; it is started from the displayGestures function and loads the gesture list in the background.
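A minimal sketch of that wiring (the exact body in the project may differ):

    private void displayGestures() {
        //runs GesturesLoadTask.doInBackground on a worker thread
        new GesturesLoadTask().execute();
    }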

Subclasses of AsyncTask override the following methods:
  • doInBackground - the main function, executed in the background
  • onPreExecute - invoked before the background task is executed
  • onPostExecute - invoked after the background task has executed
  • onProgressUpdate - can be used to update UI contents while the background task is executing
In the background task, the code walks through the gesture template directory and reads all the bitmap files.

An ArrayAdapter takes an array and converts its items into View objects to be loaded into a ListView container. We define an adapter that maintains NamedGesture objects, which hold the gesture name and an identifier. The "getView" function of the ArrayAdapter subclass is responsible for converting a Java object into a View.
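Inferred from its usage in doInBackground and getView, the NamedGesture holder looks roughly like this (a sketch, not the project's exact declaration):

    static class NamedGesture {
        Long id;        //key into the thumbnail hashtable
        String name;    //display name, e.g. "<class>_<file>"
    }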

We maintain a list of bitmaps identified by an ID, as well as a list of gesture names keyed by the same ID.

Whenever a bitmap is read, we update the lists and the GUI by calling the "publishProgress" function, which leads to the onProgressUpdate function being called on the main UI thread.
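A sketch of the corresponding onProgressUpdate, patterned on standard ArrayAdapter usage (the adapter class name GesturesAdapter is an assumption):

        @Override
        protected void onProgressUpdate(NamedGesture... values) {
            super.onProgressUpdate(values);
            final GesturesAdapter adapter = mAdapter;
            adapter.setNotifyOnChange(false);
            //add each freshly loaded gesture; the ListView calls getView to render it
            for (NamedGesture gesture : values) {
                adapter.add(gesture);
            }
            adapter.notifyDataSetChanged();
        }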

Using this approach, the thumbnails can be seen appearing one by one as they are loaded. The doInBackground implementation is shown below.
        @Override
        protected Integer doInBackground(Void... params) {
            if (isCancelled()) return STATUS_CANCELLED;
            if (!Environment.MEDIA_MOUNTED.equals(Environment.getExternalStorageState())) {
                return STATUS_NO_STORAGE;
            }

            Long id = new Long(0);
            File list = new File(CreateGestureActivity.DIR);

            //get the list of template classes (one subdirectory per gesture class)
            File[] files = list.listFiles(new DirFilter());

            for (int i = 0; i < files.length; i++) {
                //get the list of image files in the template folder
                File[] list1 = files[i].listFiles(new ImageFileFilter());
                for (int k = 0; k < list1.length; k++) {
                    //load the image file
                    BitmapFactory.Options options = new BitmapFactory.Options();
                    options.inPreferredConfig = Bitmap.Config.ARGB_8888;
                    Bitmap bitmap = BitmapFactory.decodeFile(list1[k].getPath(), options);
                    Bitmap thumbnail = ThumbnailUtils.extractThumbnail(bitmap, mThumbnailSize, mThumbnailSize);

                    final NamedGesture namedGesture = new NamedGesture();
                    namedGesture.id = id;
                    namedGesture.name = files[i].getName() + "_" + list1[k].getName();

                    //add the thumbnail to the hashtable keyed by id
                    mAdapter.addBitmap((Long) id, thumbnail);
                    id = id + 1;

                    //update the GUI
                    publishProgress(namedGesture);
                    bitmap.recycle();
                }
            }
            return STATUS_SUCCESS;
        }


Once the add function of the adapter is called, the ListView is updated, displaying the gesture name and associated bitmap through the "getView" function:

        @Override
        public View getView(int position, View convertView, ViewGroup parent) {
            if (convertView == null) {
                //view associated with individual gesture item
                convertView = mInflater.inflate(R.layout.gestures_item, parent, false);
            }
            //get the gesture at specified position in the listView
            final NamedGesture gesture = getItem(position);
            final TextView label = (TextView) convertView;

            //set the gesture names
            label.setTag(gesture);
            label.setText(gesture.name);
            //get the bitmap from hashtable identified by id and display bitmap to left of text
            label.setCompoundDrawablesWithIntrinsicBounds(mThumbnails.get(gesture.id),null, null, null);

            return convertView;
        }

Code

The files can be found in the ImgApp directory of the OpenVision repository at www.github.com/pi19404/OpenVision.

The complete Android project can be found in the samples/Android/AndroidGestureCapture directory of the OpenVision repository. It is an Android project source package and can be imported directly into Eclipse and run. The application was tested on a mobile device running Android version 4.1.2; compatibility with other Android OS versions has not been tested or kept in consideration while developing the application.

You need to have OpenCV installed on your system. The present application was developed on Ubuntu 12.04, and the paths in Android.mk are specified based on that setup. For Windows or other operating systems, or if your OpenCV paths differ, modify the make file accordingly.

The APK and source file can be downloaded from:

Wednesday 20 August 2014

Compiling Native C/C++ library for Android

Introduction

This article describes a method to cross-compile a C/C++ library for mobile devices running the Android OS.

Installation and Code Compilation

Before proceeding, make sure that you have all of the software components below installed and configured in Eclipse:
  • Eclipse IDE with the ADT plugin
  • Android SDK
  • Android NDK (required for native builds)
Develop the code on a desktop computer and check that you are able to compile it without errors.
The present example consists of files containing the following classes:
  • UniStrokeGestureRecognizer
  • UniStrokeGesture
  • GesturePoint
The library libOpenVision.so has been successfully compiled on Ubuntu; we now proceed with cross-compiling the library for ARM-based mobile devices running the Android OS.
Cross Compilation
The simplest approach is to use the Eclipse IDE, which provides a feature for adding native C/C++ support to an existing Android project.
The project name is AndroidGesture. Right-click on the Android project and
select Android Tools -> Add Native Support.
Enter the desired library name: OpenVision.
This configures the Android project for the native build and creates a jni folder containing an OpenVision.cpp file and an associated Android.mk make file.
Copy all the C/C++ project files into the jni folder, then modify the Android.mk file to configure the native build.
Create a directory called OpenVision in the jni directory.
Copy all of the following files into the ImgApp subdirectory:
  • UniStrokeGestureRecognizer.cpp
  • UniStrokeGestureRecognizer.hpp
  • UniStrokeGesture.cpp
  • UniStrokeGesture.hpp
  • GesturePoint.cpp
  • GesturePoint.hpp
Copy the file OpenCVCommon.hpp into the Common subdirectory.
The present code uses OpenCV libraries. Copy the attached OpenCV pre-compiled libraries for ARM into the libs/armeabi and libs/armeabi2 directories.
Makefiles
Below are the contents of the Android.mk file. This file is like a standard make file, containing the include paths, source files, library dependencies, etc. A few of the directives are specific to the Android build system; explanations are provided in the comments.
Android.mk file
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)

# name of the library to be built
LOCAL_MODULE    := OpenVision
#list of source files to be build as part of the library
LOCAL_SRC_FILES := ImgApp/GesturePoint.cpp ImgApp/UniStrokeGesture.cpp ImgApp/UniStrokeGestureRecognizer.cpp

# list of dependent 3rd party or external libraries, included via the LOCAL_SHARED_LIBRARIES variable
# (the module list must be defined before it is expanded with :=)
OPENCV_MODULES3:=core imgproc flann contrib features2d video highgui legacy ml objdetect
OPENCV_LIBS3:=$(OPENCV_MODULES3)
LOCAL_SHARED_LIBRARIES := $(foreach module,$(OPENCV_LIBS3),opencv_$(module))

# list of dependent system libraries
LOCAL_LDLIBS +=  -fPIC -llog -ldl -lm  -lz -lm -lc -lgcc -Wl,-rpath,'libs/armeabi-v7a' 
LOCAL_LDLIBS += -L$(LOCAL_PATH)/../libs/armeabi -llog -Llibs/armeabi-v7a/ 

# include path for header files for C and C++ applications
LOCAL_C_INCLUDES +=/usr/local/include /usr/local/include/opencv2 /home/pi19404/repository/OpenVision/OpenVision
LOCAL_CPP_INCLUDES +=/usr/local/include /usr/local/include/opencv2 /home/pi19404/repository/OpenVision/OpenVision
#The compilation flags for C/C++ applications
LOCAL_CPPFLAGS += -DHAVE_NEON -fPIC -DANDROID -I/usr/local/include/opencv  -I/usr/local/include   -I/OpenVision -I/home/pi19404/repository/OpenVision/OpenVision -fPIC
LOCAL_CFLAGS += -DHAVE_NEON -fPIC -DANDROID  -I/usr/local/include/opencv -I/usr/local/include -I/OpenVision -I/home/pi19404/repository/OpenVision/OpenVision -fPIC
LOCAL_CPP_FEATURES += exceptions

#statement specifies build of a shared library
include $(BUILD_SHARED_LIBRARY)

#files in the libs/armeabi are deleted during each build
#we need to have 3rd party opencv libraries in this directory
#the files are placed in the armeabi2 directory
#whenever a native build is triggered, the opencv library files specified in the OPENCV_MODULES2
#variable are copied from the armeabi2 directory to the armeabi or armeabi-v7a directory
#as per the specification of APP_ABI in the Application.mk file

include $(CLEAR_VARS)
OPENCV_MODULES2:= calib3d contrib  core features2d flann highgui imgproc  legacy ml nonfree objdetect photo stitching  video videostab
OPENCV_LIBS2:=$(OPENCV_MODULES2)  
OPENCV_LIB_SUFFIX:=so
OPENCV_LIB_TYPE:=SHARED


define add_opencv_module1 
include $(CLEAR_VARS)
 LOCAL_PATH := libs/armeabi2
 LOCAL_MODULE:=aaaopencv_$1
 LOCAL_SRC_FILES:=libopencv_$1.$(OPENCV_LIB_SUFFIX)
 include $(PREBUILT_$(OPENCV_LIB_TYPE)_LIBRARY) 
endef

$(foreach module,$(OPENCV_LIBS2),$(eval $(call add_opencv_module1,$(module))))

Application.mk make file
APP_ABI :=   armeabi-v7a armeabi
APP_STL := gnustl_static
APP_PLATFORM    := android-8
APP_CPPFLAGS := -frtti -fexceptions -ftree-vectorize  -mfpu=neon -O3 -mfloat-abi=softfp -ffast-math

After building the project, the libOpenVision.so files can be found in the libs/armeabi and libs/armeabi-v7a directories; these have been cross-compiled for use on Android-based devices.
They can now be loaded and called from a Java application through the JNI interface.
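For example, a generic JNI loading sketch (the gesture application itself uses the JavaCPP Loader, as shown in GestureLibraryInterface; the class name here is hypothetical):

public class OpenVisionNative {
    static {
        //resolves libOpenVision.so from the APK's native library directory
        System.loadLibrary("OpenVision");
    }
    //native methods exported by the library are declared here
}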

Files

The pre-compiled OpenCV libraries for Android can be found at www.github.com/pi19404/OpenCVAndroid.
The source and make files used above can be found in the OpenVision repository at www.github.com/pi19404/OpenVision.
The Android.mk and Application.mk files and the contents of the jni directory can be found below.