I have developed an Android app that has full touch support on any standard Android device. The (simulated) touch input on an Android TV virtual device also works without flaws.
After I transferred the app to my Echo Show 15 (Fire OS 7.5.6.1) via LAT (Live App Testing), I noticed that the app doesn't receive any touch events and the virtual remote overlay is shown. Connecting a Bluetooth mouse didn't help either; no click events were received.
I enabled/disabled different uses-feature entries in the AndroidManifest.xml (android.hardware.touchscreen, android.software.leanback, …), but nothing helped. I suspect these feature flags have no effect on touch support at all and only serve as a device filter for the Amazon Appstore.
I already found the tutorial "Add touch to Fire TV | Amazon Fire TV", but it doesn't help me because my app doesn't use the native Android GUI elements and input event handlers; instead it uses the Qt framework. I don't know exactly how Qt's Android platform abstraction works, but I believe only a SurfaceView is created, onto which the GUI is rendered via OpenGL ES 2.0. The touch events (and input events in general) are intercepted in the Activity class and passed through to the Qt event loop.
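For illustration, this is roughly the pattern I assume Qt uses; this is not Qt's actual code, just a sketch of how an Activity can intercept input, and forwardToQtEventLoop() is a hypothetical placeholder for the JNI bridge:

import android.app.Activity;
import android.view.MotionEvent;

// Sketch of the interception pattern I assume Qt uses. NOT the real Qt
// implementation; forwardToQtEventLoop() is a hypothetical placeholder.
public class QtLikeActivity extends Activity
{
    @Override
    public boolean dispatchTouchEvent(MotionEvent event)
    {
        // Hand the raw MotionEvent to the (native) event loop instead of
        // letting the standard View hierarchy consume it.
        forwardToQtEventLoop(event);
        return true; // report the event as handled
    }

    private void forwardToQtEventLoop(MotionEvent event)
    {
        // placeholder for the JNI call into the Qt side
    }
}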
The question is: how does the Echo Show 15 (or Fire OS in general) recognise that an app has touch support? I assume this is decided at runtime (not at build time). But what technical criteria must be met? If I knew them, I could patch Qt and maybe get it to work. Unfortunately, the ADB interface is disabled on the Echo Show 15, so there is no way to examine logcat for a clue as to what is preventing touch input in my app.
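Since logcat is not reachable, the only diagnostic I can think of is to render the relevant platform information directly in the app. A small sketch using only standard Android APIs (nothing Fire OS specific) that at least shows what the device reports:

import android.app.Activity;
import android.content.pm.PackageManager;
import android.os.Bundle;
import android.view.InputDevice;
import android.widget.TextView;

// Diagnostic Activity that shows on screen (because logcat is not
// available) whether the platform reports touchscreen support.
// This is only a probe to gather information, not a known fix.
public class TouchProbeActivity extends Activity
{
    @Override
    protected void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);
        PackageManager pm = getPackageManager();
        StringBuilder sb = new StringBuilder();
        sb.append("FEATURE_TOUCHSCREEN: ")
          .append(pm.hasSystemFeature(PackageManager.FEATURE_TOUCHSCREEN)).append('\n');
        sb.append("FEATURE_FAKETOUCH: ")
          .append(pm.hasSystemFeature(PackageManager.FEATURE_FAKETOUCH)).append('\n');

        // List every input device and whether it exposes a touchscreen source.
        for (int id : InputDevice.getDeviceIds()) {
            InputDevice dev = InputDevice.getDevice(id);
            if (dev == null) continue;
            boolean touch = (dev.getSources() & InputDevice.SOURCE_TOUCHSCREEN)
                    == InputDevice.SOURCE_TOUCHSCREEN;
            sb.append(dev.getName()).append(" -> touchscreen source: ")
              .append(touch).append('\n');
        }

        TextView out = new TextView(this);
        out.setText(sb.toString());
        setContentView(out);
    }
}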
By the way, is there even a single app/game in the Amazon Fire TV Appstore that has full touch support? I haven't found one.
I created a minimal app. Still no touch support. What am I missing?
package com.github.tereius.habview;

import android.os.Bundle;
import android.app.Activity;
import android.view.View;
import android.widget.Button;

public class MainActivity extends Activity
{
    @Override
    public void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);

        // A single programmatically created button as the whole content view.
        final Button button = new Button(this);
        button.setText("Hello world!");
        button.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                button.setText("Hello again");
            }
        });
        setContentView(button);
    }

    @Override
    protected void onDestroy()
    {
        super.onDestroy();
    }
}
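To double-check whether the view receives raw MotionEvents at all (independently of click synthesis), one could additionally attach an OnTouchListener inside onCreate() (requires import android.view.MotionEvent; this is only a diagnostic sketch, not a fix):

        // Diagnostic only: show which raw touch action arrives, if any.
        button.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                button.setText("Touch: " + MotionEvent.actionToString(event.getAction()));
                return false; // keep the click listener working
            }
        });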
<?xml version="1.0"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.github.tereius.habview"
          android:installLocation="auto"
          android:versionCode="1001103"
          android:versionName="1.0.11">

    <uses-permission android:name="android.permission.INTERNET"/>
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
    <uses-permission android:name="android.permission.BLUETOOTH"/>

    <uses-feature android:name="android.hardware.touchscreen" android:required="false"/>
    <uses-feature android:name="android.software.leanback" android:required="false"/>

    <application android:supportsRtl="true">
        <activity
            android:name="com.github.tereius.habview.MainActivity"
            android:label="HabView"
            android:exported="true">
            <intent-filter>
                <action android:name="android.intent.action.MAIN"/>
                <category android:name="android.intent.category.LAUNCHER"/>
            </intent-filter>
        </activity>
    </application>
</manifest>
If I remove the Bluetooth permission <uses-permission android:name="android.permission.BLUETOOTH"/>
from the manifest, I can enable the "touch capabilities" in the LAT.
Those two touch-capability options are not shown if the Bluetooth permission is declared in the manifest. Is this a bug in the Appstore?
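One guess: the Appstore might derive an implied hardware requirement from the permission (Google Play is documented to do this for some permissions, e.g. BLUETOOTH implying android.hardware.bluetooth). A workaround that might be worth trying, although I have not verified it against the LAT, is to keep the permission but explicitly mark the feature as optional:

    <!-- Keep the Bluetooth permission but declare the hardware feature as
         optional so it (hopefully) no longer acts as a device filter.
         Untested against the LAT / Echo Show 15. -->
    <uses-permission android:name="android.permission.BLUETOOTH"/>
    <uses-feature android:name="android.hardware.bluetooth" android:required="false"/>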
My last attempt: I enabled the "touch capability" in the LAT, compiled the official sample touch app, and added the APK to the LAT. Then I updated the app on my Echo Show 15. Nevertheless, I cannot use touch input and the virtual remote control is still displayed.
Running into the same issue. Have you found a solution?
At the moment it does not seem possible to enable the touch capability for the Echo Show 15. I am in contact with Amazon Developer Support, but they are very slow to respond. This is the correspondence so far:
Amazon Support:
Thank you for contacting us! At the moment, Echo Show 15 is not enabled for development mode or debugging, and certain features are limited. In addition to that, it will display a “Limited Touch Functionality” overlay on third party apps. Your users should still be able to navigate and use your app though. Apologies it does not completely solve the issue with the overlay, but I hope that answers your question. Thanks!
My follow-up question:
Did I understand that correctly: third-party apps (like mine) can only be operated with the (virtual) remote control and not via touch input, even if the app was developed with full touch support (as described in "Add touch to Fire TV | Amazon Fire TV")? So the only apps that support touch input on the Echo Show 15 are Amazon's own apps such as "Prime Video" or the "Amazon Silk Browser"? Can you confirm this, please?
Amazon Support:
I am discussing this with the relevant internal team, and will get back to you as soon as I have a definitive answer. Thank you!
I received this reply on 08 April and am still waiting for a definitive answer…
If it turns out that Amazon favors its own apps over third-party apps (by deliberately disabling touch input for third-party apps), this probably violates the EU’s Digital Markets Act.
And I thought I was the only one experiencing these long response times with Amazon support. I haven't received a reply to my tickets for about two months.
Hi,
Is there any update on this? I am having the same issue with ADB debugging on the Echo Show.
I am porting over an Android app for my smart home alarm and cameras, but I can't seem to do any testing.
This device is the complete opposite of what a smart home device should be. They are breaching EU law.
Yes, this is the typical gatekeeper behavior we are seeing here: artificially preventing third-party applications from exploiting the full potential of the device. Only the gatekeeper's own apps are allowed to do so.
I have not received an answer from Amazon support and will probably never receive one, because of this question: "[…] the only apps that support touch input on the Echo Show 15 are Amazon's own apps such as "Prime Video" or the "Amazon Silk Browser"? Can you confirm this, please?" If they were to answer "yes" to this question, they would be admitting that they are in breach of the EU's Digital Markets Act.
I would recommend that anyone who is unhappy with the situation file a complaint here: https://appfairness.org. There are not many other options, except of course to go to court and sue Amazon.