30.10.2016

Android custom voice commands using accessibility

If you have tried to implement custom voice commands on the Android platform, you have probably noticed that only commands approved by Google are available to you.
So if you want your application to respond to a custom voice command, the only option is to say something like "Start MyApplication" and then, once your application starts, listen for another voice command.
This is not as usable as we would like, because it adds an extra step and wait time between starting your app and issuing the voice command again.
But there is a way to circumvent this. It is not a pretty solution, but it works, and it seems to be the only way until Google opens up an API for custom voice commands.
The trick is to use the Accessibility API which, as you may know, can read an application's screen — so we can intercept the speech-to-text result and act on it in our app.
First of all, define your service inside your AndroidManifest.xml:

<service android:name="hr.coreaplikacije.automate.AutoAccessibilityService" android:permission="android.permission.BIND_ACCESSIBILITY_SERVICE">
    <intent-filter>
        <action android:name="android.accessibilityservice.AccessibilityService" />
    </intent-filter>
    <meta-data android:name="android.accessibilityservice" android:resource="@xml/accessserviceconfig" />
</service>
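One thing to keep in mind: an accessibility service does nothing until the user enables it manually under Settings → Accessibility. On a device you would send the user there with an Intent for Settings.ACTION_ACCESSIBILITY_SETTINGS, and you can check whether your service is already enabled by looking for its flattened component name in the colon-separated Settings.Secure "enabled_accessibility_services" value. Here is a minimal sketch of that check — the Android calls are kept as comments so the matching logic stays plain Java, and the component name is simply the one from the manifest above:

```java
public class AccessibilityCheck {
    // On Android you would fetch the colon-separated list like this:
    //   String enabled = Settings.Secure.getString(getContentResolver(),
    //           Settings.Secure.ENABLED_ACCESSIBILITY_SERVICES);
    // and open the accessibility settings screen with:
    //   startActivity(new Intent(Settings.ACTION_ACCESSIBILITY_SETTINGS));

    // Checks whether a flattened component name appears in the
    // colon-separated enabled_accessibility_services value.
    public static boolean isServiceEnabled(String enabledList, String flatComponent) {
        if (enabledList == null) return false;
        for (String entry : enabledList.split(":")) {
            if (entry.equalsIgnoreCase(flatComponent)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        String flat = "hr.coreaplikacije.automate/hr.coreaplikacije.automate.AutoAccessibilityService";
        String enabled = "com.example.other/.OtherService:" + flat;
        System.out.println(isServiceEnabled(enabled, flat)); // prints "true"
        System.out.println(isServiceEnabled(null, flat));    // prints "false"
    }
}
```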

Then create the accessserviceconfig.xml file and place it inside res/xml:

<?xml version="1.0" encoding="utf-8"?>
<accessibility-service
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:accessibilityEventTypes="typeAllMask"
    android:accessibilityFeedbackType="feedbackAllMask"
    android:notificationTimeout="2000"
    android:settingsActivity="hr.coreaplikacije.automate.MainActivity"
    android:canRetrieveWindowContent="true" />

And then create the AutoAccessibilityService service class:
package hr.coreaplikacije.automate;

import android.accessibilityservice.AccessibilityService;
import android.text.TextUtils;
import android.util.Log;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

import java.util.ArrayList;

/**
 * Created by Dario Filkovic on 23.6.2016.
 */
public class AutoAccessibilityService extends AccessibilityService
{
    private static final String TAG = "AutoAccessibility";

    // Remembers phrases we already reacted to so the same text is not handled twice.
    private ArrayList<String> mFoundWords = new ArrayList<String>();

    private String getEventText(AccessibilityEvent event)
    {
        StringBuilder sb = new StringBuilder();
        for (CharSequence s : event.getText())
        {
            sb.append(s);
        }
        return sb.toString();
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event)
    {
        AccessibilityNodeInfo node = event.getSource();
        final String str = iterateThroughChildren(node);

        if (!str.equals(""))
        {
            if (mFoundWords.contains(str)) return;
            Log.d(TAG, "Found string: " + str);
            mFoundWords.add(str);
        }
    }

    // Walks the node tree looking for the EditText of the Google search box,
    // which holds the recognized speech-to-text result.
    private String iterateThroughChildren(AccessibilityNodeInfo node)
    {
        if (node != null)
        {
            if (isGoogleSearchBox(node)) return node.getText().toString();
            for (int i = 0; i < node.getChildCount(); i++)
            {
                AccessibilityNodeInfo childNodeView = node.getChild(i);
                if (childNodeView != null)
                {
                    if (isGoogleSearchBox(childNodeView)) return childNodeView.getText().toString();
                    String test = iterateThroughChildren(childNodeView);
                    if (!test.equals("")) return test;
                }
            }
        }
        return "";
    }

    // getClassName() and getPackageName() return CharSequence and may be null,
    // so compare with TextUtils.equals instead of calling String.equals on them.
    private boolean isGoogleSearchBox(AccessibilityNodeInfo node)
    {
        return TextUtils.equals(node.getClassName(), "android.widget.EditText")
                && TextUtils.equals(node.getPackageName(), "com.google.android.googlequicksearchbox")
                && node.getText() != null;
    }

    @Override
    public void onInterrupt()
    {
    }
}
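One caveat with mFoundWords above: since the list only grows, a phrase that was handled once is ignored forever, so the same command cannot be repeated. A minimal, hedged sketch of one way around this — forgetting each phrase after a short delay with java.util.Timer (this helper class and the time-out value are my assumptions, not part of the original service):

```java
import java.util.ArrayList;
import java.util.Timer;
import java.util.TimerTask;

public class FoundWordsCache {
    private final ArrayList<String> mFoundWords = new ArrayList<String>();
    private final Timer mTimer = new Timer(true); // daemon timer

    // Returns true only the first time a phrase is seen within the expiry window.
    public synchronized boolean addIfNew(final String phrase, long expireAfterMs) {
        if (mFoundWords.contains(phrase)) return false;
        mFoundWords.add(phrase);
        // Forget the phrase after a delay so the same command can be repeated later.
        mTimer.schedule(new TimerTask() {
            @Override
            public void run() {
                synchronized (FoundWordsCache.this) {
                    mFoundWords.remove(phrase);
                }
            }
        }, expireAfterMs);
        return true;
    }
}
```

In onAccessibilityEvent you would then replace the contains/add pair with a single addIfNew(str, someTimeout) call.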

What this code actually does is walk through all the views, and when one of them is the Google search box it logs the found text. After that it is just a matter of firing an intent (or any other action) when the found text matches the command you are looking for.
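As a sketch of that last step, here is one way the matching could look: normalize the recognized text, then compare it against your command phrase. The command string and the launch code are assumptions for illustration; on Android the launch would be something like startActivity(...) with FLAG_ACTIVITY_NEW_TASK, shown here as comments so the matching logic stays plain Java:

```java
public class CommandMatcher {
    // Normalizes a recognized phrase: lower-case, trimmed, single spaces.
    public static String normalize(String heard) {
        return heard.trim().toLowerCase().replaceAll("\\s+", " ");
    }

    // True when the recognized text matches the custom command phrase.
    public static boolean matches(String heard, String command) {
        return normalize(heard).equals(normalize(command));
    }

    public static void main(String[] args) {
        String heard = "  Start  MUSIC player "; // what the search box contained
        if (matches(heard, "start music player")) {
            // Inside the accessibility service you would fire your action here, e.g.:
            //   Intent i = getPackageManager()
            //           .getLaunchIntentForPackage("com.example.musicplayer");
            //   i.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            //   startActivity(i);
            System.out.println("Command matched"); // prints "Command matched"
        }
    }
}
```

Exact matching is the simplest choice; depending on how noisy the recognition is, you might prefer a startsWith or keyword check instead.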
