
Tapster — Robot as a Service 🤖

🇺🇸 – Tuesday, August 28th, 2018

Keywords: #Tapster, #automation, #robots, #DIY, #tests

Disclaimer: I have no stock in Tapster Robotics, Inc. I wrote this article because I contributed, in my personal spare time, to a project which, to my mind, deserves to be better known. I had the chance and the great opportunity to contribute to a free and open source project: the Tapster robot designed by Tapster Robotics, Inc. Because the experience was rich and exciting, I wanted to write it down. I won’t talk about the robot’s initialization: this wiki page covers it clearly and effectively.

1 — First things first, why such a robot?

When I got my new job, I did some research on the state of the art of automation and test tools, which are quite useful for us developers. I found a strange but fun project (mainly thanks to this article): a real robot able to tap on smartphone screens. The project was also free, open source and open hardware (wow!), and anyone could build a new one using a 3D printer.
However, I found something was missing. The robot was driven by a Node.js server receiving HTTP requests, but there were no clients to send such requests, what a pity. Lucky strike: I am a developer, so I chose to work on client apps in my own leisure time so as to enrich this project. Second lucky strike: in my team at Orange Group, we have to create tools to test and validate innovative products and services, and I later managed to introduce this robot to my team. Here the fun starts!

2 — The core of the bot, its server

The basic features of the robot were already cool. If we dive into the legacy GitHub repository (or this one), we find several basic features such as defining the 3D position of the robot’s finger, changing its arms’ angles, or tapping on coordinates… The problem was the lack of other types of moves we might need for testing purposes. Thus the robot’s server was upgraded with new features: tapping several times on a point, double taps, swipes, taps on random points, and drawing patterns. This new server was used as a base for the following clients.
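
To give an idea of the kind of requests involved, here is a minimal client-side sketch in Python. The port, route names and payload formats are assumptions made for illustration; refer to the repositories above for the real API.

    # Hypothetical client-side view of the robot's HTTP API.
    # Port, routes and payloads are illustrative assumptions.
    import requests

    ROBOT_URL = "http://localhost:4242"  # assumed address of the Node.js server

    # Ask the robot to tap once on a point of the device screen
    requests.post(f"{ROBOT_URL}/tap", json={"x": 200, "y": 400})

    # Ask the robot to swipe from one point to another
    requests.post(f"{ROBOT_URL}/swipe",
                  json={"startX": 100, "startY": 800, "endX": 100, "endY": 200})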

3 — {AT | T} Driven Development

When you are worried about the quality of your product, you may set up processes like ATDD (Acceptance Test Driven Development), TDD (Test Driven Development) or BDD (Behavior Driven Development). If we focus on ATDD or TDD, we can find powerful and efficient tools to write tests. If we are looking for acceptance tests, Robot Framework does the job. If we want instrumented tests for mobile apps, we can use UI Automator / Espresso, Selenium or Appium. And for unit tests you can get plenty of libraries like JUnit. The thing is, Robot Framework uses third-party tools (guess what? Appium, for example) to run tests on mobile apps. And what if this framework were able to deal with a Tapster robot? Challenge completed: keywords have been implemented. They give the people in charge of tests access to several routes of the robot’s API so as to make gestures on apps. And if we combine these keywords with Appium’s, the level of abstraction makes it easier to have the robot interact with the app under test. New features have been added, like stress taps and stress swipes (useful for monkey tests) and interactions with UI widgets using their IDs, texts or XPath. In other words, these keywords helped a lot.
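
For the curious, a Robot Framework keyword library is basically a Python class whose public methods become keywords. Below is a minimal sketch of that mechanism, assuming the same hypothetical /tap route as above; the real keywords live in the projects’ repositories.

    # Minimal sketch of a Robot Framework keyword library for the robot.
    # Route and payload are illustrative assumptions.
    import requests

    class TapsterLibrary:
        """Each public method becomes a keyword usable in .robot files."""

        def __init__(self, base_url="http://localhost:4242"):
            self.base_url = base_url

        def tap_to_point(self, x, y):
            """Keyword 'Tap To Point': makes the robot tap at (x, y)."""
            response = requests.post(f"{self.base_url}/tap",
                                     json={"x": int(x), "y": int(y)})
            response.raise_for_status()

In a .robot test file, such a keyword can then be called with "Tap To Point    200    400", right next to Appium’s keywords.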

4 — Automation is good, especially with scripting

DevOps is nowadays a well-known process improving the way of working and the communication between the developers’ team and the ops team. To automate all the test, integration and validation steps of a product, Jenkins is generally used. One of its benefits is the ability to trigger scripts, e.g. written in Python, so as to execute tasks. Because we wanted to use the robot (i.e. send HTTP requests to its server) without additional glue, a Python client has been implemented. Some people want an interactive mode? No problem. Others always want to execute predefined commands? Implemented. With this efficient scripting language and the Jenkins scheduler, it was possible to automate interactions on a smartphone using the robot.
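
Such a client does not need to be big. Here is a sketch of the two modes mentioned above; the command names and routes are made up for the example.

    # Sketch of a Python client with a one-shot mode (handy for Jenkins)
    # and an interactive mode. Commands and routes are assumptions.
    import sys
    import requests

    ROBOT_URL = "http://localhost:4242"  # assumed server address

    def run(command):
        """Map a textual command to an HTTP request to the robot's server."""
        if command == "tap-center":
            requests.post(f"{ROBOT_URL}/tap", json={"x": 540, "y": 960})
        elif command == "swipe-up":
            requests.post(f"{ROBOT_URL}/swipe",
                          json={"startX": 540, "startY": 1500,
                                "endX": 540, "endY": 400})
        else:
            print(f"Unknown command: {command}")

    if __name__ == "__main__":
        if len(sys.argv) > 1:  # one-shot mode, scriptable from a Jenkins job
            run(sys.argv[1])
        else:  # interactive mode
            while (command := input("robot> ")) != "quit":
                run(command)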

5 — Think of your colleagues, think GUI...

OK, I am a developer, so I work on GNU/Linux-based OSes and the console is one of my best friends. But you may have colleagues who are not developers and who do not use console-based software at all, only tools with a lovely and shiny graphical user interface. In addition, if I want to animate showcases, graphical clients are sexier. For these reasons, Android and Web apps have been implemented; they provide a handy feature: driving the robot remotely over Wi-Fi, quite useful if your computer and the robot are in a corner while you talk to attendees or drink a pint of beer behind a kakemono.

6 — Hey bot!

One day I got bored of always triggering the same commands to test the features. It was quite annoying to always make the same moves (that is the reason why robots have been created, isn’t it?) and an idea appeared: what if an assistant were integrated inside the Android app? What if I were able to… talk to the robot? So let’s implement an assistant, but which one? Google Assistant, for sure… Just kidding! The chosen one was Snips. Snips is an off-the-grid assistant solution implementing both ASR (Automatic Speech Recognition) and NLU (Natural Language Understanding). Nothing is done in a foreign cloud you don’t own; in fact, nothing is done in a cloud at all. Operations are processed locally, thus the solution is respectful of your private life. You can define several intents for your assistant, train it and also trigger actions.
Once the assistant is defined, you just have to download the assets and use the binaries (dedicated to your platform) in order to process your voice and match what you said with what should be done. The solution is quite incredible, and its high degree of customization makes the GAFAM’s solutions look not so exciting. After defining the French and the English assistants, I integrated them into the Android app, and the result is quite fun :–) (if you understand French, see this video).
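
On the Snips platform, detected intents are published on an MQTT broker (the hermes protocol). The assistant was integrated in the Android app, but here is a transposed sketch in Python showing the principle; the intent name and robot route are hypothetical.

    # Sketch of Snips intent handling: listen to intents on the MQTT
    # broker of the Snips platform and forward them to the robot's server.
    # Intent name, route and addresses are illustrative assumptions.
    import json

    import paho.mqtt.client as mqtt
    import requests

    ROBOT_URL = "http://localhost:4242"  # assumed robot server address

    def on_connect(client, userdata, flags, rc):
        # Subscribe to every intent recognized by the NLU
        client.subscribe("hermes/intent/#")

    def on_message(client, userdata, msg):
        payload = json.loads(msg.payload.decode())
        intent_name = payload["intent"]["intentName"]
        if intent_name.endswith("TapCenter"):  # hypothetical intent
            requests.post(f"{ROBOT_URL}/tap", json={"x": 540, "y": 960})

    client = mqtt.Client()
    client.on_connect = on_connect
    client.on_message = on_message
    client.connect("localhost", 1883)  # Snips platform's MQTT broker
    client.loop_forever()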

→ And so what?

This project is amazing. Firstly, the model 2 of the robot is free, open source, open hardware and not expensive to build.
The licenses in use are MIT and BSD 2-Clause, and several forks (some alive, some dead) exist. Secondly, even if the model 2 is not enough for you, you can improve it to fit your needs or use another model: the Tapster Sidekick (see this video, this one and this one). Finally, the robot helped with certain test cases. In fact, common testing software can fail to trigger events (taps, swipes, …) on secured elements like widgets displayed by the SIM card. With more and more secured apps, we might need hardware-based tools to complete tests and enhance quality workflows.

But to my mind, a tricky part of the robot is the calibration. Indeed, the robot is agnostic: it doesn’t care about the model of smartphone (or tablet) you are using, nor about its OS. In fact, if you tell the robot to tap at (x, y), it will tap at (x, y). That’s all. (x, y) might be a point outside the screen, or a different point from the one you want to tap on. That is the reason why you have to calibrate the robot for a given device model, and this point has also been improved.
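
To illustrate the idea with made-up numbers: calibration boils down to computing a mapping, e.g. an affine transform, from screen pixels to robot coordinates, based on a few reference points. A toy sketch (not the project’s actual algorithm):

    # Toy illustration of calibration: compute an affine map from screen
    # pixels to robot coordinates using three reference points.
    import numpy as np

    # Screen points (pixels) and the matching robot coordinates (made up).
    screen = np.array([[0, 0], [1080, 0], [0, 1920]], dtype=float)
    robot = np.array([[-30.0, 40.0], [30.0, 40.0], [-30.0, -40.0]])

    # Solve A @ M = robot for the 3x2 affine matrix M.
    A = np.hstack([screen, np.ones((3, 1))])
    M = np.linalg.solve(A, robot)

    def screen_to_robot(x, y):
        """Convert a screen point into robot coordinates."""
        return np.array([x, y, 1.0]) @ M

    print(screen_to_robot(540, 960))  # screen center -> [0. 0.]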

However, one regret remains, about Appium.

A few years ago, version 1.4 of Appium was able to deal with a Tapster robot. But with the massive refactoring of Appium for its version 1.5 (done with axes and chainsaws), the code dealing with the robot was withdrawn and its milestone deprecated.
Thus it would be cool to integrate the robot management again in this tool… who’s on it?

If you are looking for resources, feel free to follow these links:



Last update: Tuesday, May 23rd, 2023
Previously on Medium and paper.wf

Licensed under CC-BY-SA 4.0.
Opinions are my own.