Saturday, February 11, 2017

About last year...

It has been a while since I last posted on this blog, so I guess it is time to update it again and record some of the things I did last year (2016).

This past year was a little different for me than the previous ones. Instead of working full-time at a company, I decided to focus on my own projects while working part-time as a freelance developer and studying as a graduate special student at the University of São Paulo.

Personal projects

Thanks to this change I had time to work on apps like inSPorte, which lets people in São Paulo see where their buses are in real time, and also lets them rate and report problems with these buses.

Unfortunately this app hasn't grown as much as I expected it to, but the experience taught me that developing an app is only part of the story and that a lot more has to be done for it to be really successful: it has to be well advertised, it has to have a well-defined business model, and so on.

Off-topic: since I was mentioning a public transit app, let me announce that I have just released a simple app called Nexbus that lets people in Dublin check when the next bus is arriving at a bus stop.

Freelance projects

As a freelance developer I have been working on projects such as Bike da Firma (now Bora Bike) and SportsToGo. Last year I also worked on a virtual reality app called Honda VR Experience.

Honda VR Experience

The car maker Honda was about to release a new version of the Honda Civic in Brazil, so they commissioned a VR app to promote the new vehicle. The app presented a virtual test drive of the car (in the form of a 360° video) before it was released to the public. For this campaign, Honda bought around 250 Samsung Galaxy S7 + Gear VR units, installed the app on each of these devices, and then sent them to every Honda car dealership in Brazil. Needless to say, Honda had a very big budget.

Some frames of the 360° video and scenes from the app
The app as viewed on the Gear VR

Configuring and installing the app on multiple devices was also part of the job

That was my first time working on a commercial VR app, and also my first time working with an advertising agency (Jüssi) and an animation studio (Big Studios). My role in the project was to develop the Unity app for the Gear VR: writing scripts for the scenes, UI elements, controls, and playback of the 360° video. Jüssi and Big Studios were the ones who produced all the visual assets, including the 360° video itself. The app also used a special plugin from Two Big Ears to play back the 3D immersive audio produced by Upmix.

I definitely had some fun working with VR

Graduate studies

From August to December I studied "Artificial Intelligence" and "Laboratory of Image Processing and Computer Vision". For "Artificial Intelligence" I implemented some classic AI algorithms in Python, such as heuristic search (A*), Minimax, a decision tree classifier (Hunt's algorithm), and MDP solvers (value iteration and Q-Learning). The Computer Vision course, however, required us to do some real research and come up with new ideas to improve state-of-the-art algorithms for problems in this field. Clearly, one semester is not enough time to achieve real improvements, but we still got some interesting results on the problems we tackled.
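
To give a flavour of the MDP part: value iteration boils down to repeatedly applying the Bellman optimality update until the values stop changing. Here is a minimal Python sketch on a toy two-state MDP of my own invention (not one of the course assignments):

```python
# Value iteration on a toy 2-state MDP (hypothetical example).
# States 0 and 1; action "stay" keeps the state, "move" switches it.
# Being in (or moving into) state 1 yields reward 1, otherwise 0.

GAMMA = 0.9  # discount factor

# transitions[state][action] = (next_state, reward); deterministic for brevity
transitions = {
    0: {"stay": (0, 0.0), "move": (1, 1.0)},
    1: {"stay": (1, 1.0), "move": (0, 0.0)},
}

def value_iteration(eps=1e-6):
    V = {0: 0.0, 1: 0.0}
    while True:
        delta = 0.0
        for s in transitions:
            # Bellman optimality update: best one-step lookahead value
            best = max(r + GAMMA * V[ns] for ns, r in transitions[s].values())
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            return V

V = value_iteration()
# Both states converge to 1 / (1 - 0.9) = 10, since reward 1 is
# reachable every step from either state.
```

The in-place (Gauss-Seidel style) update shown here converges to the same fixed point as the textbook two-array version, just usually in fewer sweeps.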

This course had only two projects. The first required us to use the Image Foresting Transform algorithm (a generalization of Dijkstra's shortest-path algorithm to images) together with other image processing techniques to create a program capable of correctly identifying connections, drawn by a person on a sheet of paper, between a set of dots. The idea was to use this program as a tool for a smarter automatic grader, allowing students to answer multiple choice questions by connecting dots instead of filling in squares.

One of the great challenges of this project was to design a path-cost function for the Image Foresting Transform that could correctly resolve overlapping connections. In my implementation, darker paths cost less than lighter paths, and smoother paths cost less than paths with high curvature (i.e. sharp turns), so smooth, dark paths are favoured in regions with overlap. This works well in most cases, but it can fail when the person only draws connections with sharp turns instead of smooth ones. A solution would be to apply the curvature-aware path-cost function only in regions with overlap, rather than to the whole image, so that unambiguous regions containing sharp turns would only have their brightness analysed; to do that, however, another algorithm would be needed to find the regions where connections overlap.
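
To make the idea concrete, here is a much-simplified Python sketch of the shortest-path core. This is my own illustration, not the project's code: it uses plain Dijkstra with a purely additive brightness cost and ignores curvature entirely, whereas real IFT path-cost functions need not be additive.

```python
import heapq

# Simplified IFT-style search on a grayscale image (2D list of ints,
# 0 = black ink, 255 = white paper). Each step pays the brightness of
# the pixel being entered, so the search favours following dark,
# drawn lines. Curvature is ignored for brevity.

def darkest_path_cost(img, start, goal):
    h, w = len(img), len(img[0])
    dist = {start: 0}
    pq = [(0, start)]
    while pq:
        d, (y, x) = heapq.heappop(pq)
        if (y, x) == goal:
            return d
        if d > dist.get((y, x), float("inf")):
            continue  # stale queue entry
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                nd = d + img[ny][nx]  # dark pixels (low values) are cheap
                if nd < dist.get((ny, nx), float("inf")):
                    dist[(ny, nx)] = nd
                    heapq.heappush(pq, (nd, (ny, nx)))
    return float("inf")

# A dark horizontal stroke across white paper:
stroke = [[255, 255, 255],
          [  0,   0,   0],
          [255, 255, 255]]
```

On this tiny image, a path that follows the dark stroke costs 0, while any path forced through white pixels pays 255 per white pixel it crosses, which is exactly why the search "snaps" to drawn connections.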

One does not need to know Portuguese to understand that there's a lot to be improved

The second project required us to use a "3D version" of the Image Foresting Transform algorithm or a Graph Cut (Max-flow Min-cut algorithm) to implement a program that performs "skull stripping" (i.e. automatic segmentation of the brain in 3D MRI images). I implemented both algorithms, so that I could compare the runtime and the quality of the segmentation generated by each one.

In simple terms, the Max-flow Min-cut algorithm's runtime depends on the edge weights of the graph, so even for 3D images with the same number of voxels it can vary a lot; on the other hand, it is capable of generating segmentations with smoother boundaries than the Image Foresting Transform.
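
For reference, the max-flow computation at the heart of a graph cut can be sketched with Edmonds-Karp, the BFS-based variant of Ford-Fulkerson (whose worst-case bound, unlike plain Ford-Fulkerson's, does not depend on the capacities). The four-node graph below is my own toy illustration, vastly smaller than the voxel graphs the project used:

```python
from collections import deque

# Minimal Edmonds-Karp max-flow on an adjacency-matrix graph.
# cap[u][v] is the capacity of edge u -> v (0 means no edge).
# Graph-cut segmentation builds a much larger graph of this kind,
# with one node per voxel plus source/sink terminals.

def max_flow(cap, s, t):
    n = len(cap)
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return flow  # no augmenting path left: flow is maximal
        # find the bottleneck capacity along the path
        v, bottleneck = t, float("inf")
        while v != s:
            u = parent[v]
            bottleneck = min(bottleneck, cap[u][v])
            v = u
        # push the bottleneck flow, updating residual capacities
        v = t
        while v != s:
            u = parent[v]
            cap[u][v] -= bottleneck
            cap[v][u] += bottleneck
            v = u
        flow += bottleneck

# Toy graph: source 0, sink 3; min cut is {1->3, 2->3} with value 4.
capacities = [[0, 3, 2, 0],
              [0, 0, 1, 2],
              [0, 0, 0, 2],
              [0, 0, 0, 0]]
```

By the max-flow min-cut theorem, the value returned equals the weight of the minimum cut separating source from sink, which is the cut the segmentation reads off.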

The first step of the program finds a region of the image with high average brightness and low variance

Get in touch with me if you are interested in reading my detailed reports (in Portuguese) on any of these projects.

That's all for now! Hopefully I will post again before the end of this year.

Monday, April 25, 2016

"Just don't crash" is now available for iOS

Although I have developed other games before, this is my first game on the App Store! Developing it was not something I was planning; it just happened. For a job interview I was given four days to develop a racing game using HTML Canvas and JavaScript, and this is what I came up with:

Unfortunately I was not hired, and I was told not to publish the game's source code, because the company might ask other candidates to build similar games. So I decided to port it to iOS, and this is the result:

I also had to replace some game assets that were probably not free to use, but the gameplay remains the same.

About the game

This game has only one rule: don't crash. There is no winning; you will eventually crash. The goal is simply to go as far as you can before that happens. You can download it for free on your iPhone, iPad or iPod touch: App Store. But be warned: this game is difficult and really addictive!

Just don't crash will soon be available for the Apple TV as well.

P.S.: I would like to thank OpenGameArt and its community; most of the game's assets came from that website.

EDIT: Video:

Thursday, March 24, 2016

HoloViewer App

I have been developing a new app that uses the same "hologram" concept I used in HoloGlobo last year. It will be available on the Apple App Store soon!

With this app you will be able to easily turn your favourite 3D models into "holograms". Watch the video below to see how it works:

EDIT: HoloViewer is now available on the App Store! Download

Sunday, February 7, 2016

Weekend project: Apple Watch as a TV Remote (using Arduino, ESP8266 and Infrared LED)

Have you ever thought about using your Apple Watch to control your TV (and potentially other home appliances)? Well I have, and this is what I built this weekend:

Arduino Due + ESP8266 + IR LED
(I could have programmed the ESP8266 directly and used one of its GPIOs to control the IR LED, instead of using an Arduino)

Apple Watch

Interested? Check out the Arduino sketch below, or go to GitHub to download this demo project.

/*
 * Arduino Due + ESP8266 + IR emitter
 * created by Fabio de Albuquerque Dela Antonio
 * based on this example:
 */

/* Uses Arduino-IRremote-Due ( */
#include <IRremote2.h>

/* IR LED on pin 7 */
IRsend irsend;

#define DEBUG true

/* Uses Serial1 for the ESP8266 */
#define esp8266 Serial1

void setup() {

  Serial.begin(115200);

  /* Using 115200 as the baud rate, yours may be different */
  esp8266.begin(115200);

  /* AP mode, you'll need to connect to the ESP8266 AP to communicate with it */
  sendData("AT+CWMODE=2\r\n", 1000, DEBUG);

  /* Allow multiple connections */
  sendData("AT+CIPMUX=1\r\n", 1000, DEBUG);

  /* TCP/IP server on port 1337 ( is the IP usually) */
  sendData("AT+CIPSERVER=1,1337\r\n", 1000, DEBUG);
}

void loop() {

  if(esp8266.available()) {
    if(esp8266.find("+IPD,")) {

      /* Parse "+IPD,<connectionId>,<length>:<payload>" */
      char buffer[128];
      memset(buffer, 0, 128);
      esp8266.readBytesUntil(',', buffer, 128);
      int connectionId = atoi(buffer);

      memset(buffer, 0, 128);
      esp8266.readBytesUntil(':', buffer, 128);
      int length = atoi(buffer);

      memset(buffer, 0, 128);
      sprintf(buffer, "Connection: %d Length: %d", connectionId, length);
      Serial.println(buffer);

      /* Read the command name, up to the opening parenthesis */
      memset(buffer, 0, 128);
      esp8266.readBytesUntil('(', buffer, 128);
      Serial.print("Command: ");
      Serial.println(buffer);

      /* sendIR(<code>) command */
      if(strcmp(buffer, "sendIR") == 0) {

        memset(buffer, 0, 128);
        esp8266.readBytesUntil(')', buffer, 128);
        unsigned long value = strtoul(buffer, NULL, 10);

        memset(buffer, 0, 128);
        sprintf(buffer, "OK IR %lu", value);

        /* Send this code to the IR emitter */
        irsend.sendSamsung(value, 32);

        sendConnection(connectionId, buffer);
      }
      else {

        Serial.println("Unknown command");

        sendConnection(connectionId, "ERROR");
      }

      closeConnection(connectionId);
    }
  }
}

String sendData(String command, const int timeout, boolean debug) {

  String response = "";
  esp8266.print(command);
  long int time = millis();

  while((time + timeout) > millis()) {
    while(esp8266.available()) {
      char c =;
      response += c;
    }
  }

  if(debug) {
    Serial.print(response);
  }

  return response;
}

void sendConnection(int connectionId, char * string) {

  char buffer[128];
  memset(buffer, 0, 128);
  sprintf(buffer, "AT+CIPSEND=%d,%d\r\n", connectionId, strlen(string));
  sendData(String(buffer), 1000, DEBUG);
  sendData(String(string), 1000, DEBUG);
}

void closeConnection(int connectionId) {

  char buffer[128];
  memset(buffer, 0, 128);
  sprintf(buffer, "AT+CIPCLOSE=%d\r\n", connectionId);
  sendData(String(buffer), 1000, DEBUG);
}
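
The watch app itself isn't shown here, but any device connected to the ESP8266's access point can talk to this server. A hypothetical Python client (my own sketch, not the actual watchOS code; the host and port match the Arduino sketch):

```python
import socket

# Hypothetical client for the Arduino sketch above: send a
# sendIR(<code>) command to the ESP8266's TCP server and read the
# reply ("OK IR <code>" on success, "ERROR" otherwise).

def build_command(code):
    """Format the command string the Arduino sketch parses."""
    return "sendIR(%d)" % code

def send_ir(code, host="", port=1337):
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(build_command(code).encode())
        return sock.recv(128).decode()

# e.g. send_ir(3772793023)  # an IR code captured from your own remote
```

The code value is whatever 32-bit Samsung-protocol code you capture from your TV's remote; the one in the comment is only a placeholder.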

Sunday, January 31, 2016

New inSPorte

In 2013, my friends and I developed an app during the first São Paulo Bus Hackathon, and now, after a few years, we will finally release a commercial (but 100% free) version of this app for iOS and Android!

With inSPorte you will be able to track all your buses in real time. Besides that, you will also be able to rate these buses, helping to improve the public transit system!

You will be able to track your buses using your watch

So, if you live in São Paulo and use the public transit system of the city, you might consider downloading our app! (in Portuguese)

Monday, January 11, 2016

OpenGL and Arduino, will it blend?

I was working with Arduino recently and I wondered: can this small ATmega328 run some 3D graphics? It turns out there are a few projects that do exactly that, some of them really good, and that was enough to inspire me to try my own micro-project. So I thought: why not port OpenGL to Arduino? Let's see what a 16 MHz 8-bit microcontroller can do.

And this is the result:

Cubes are easy!

STL models don't look so nice...

By the way, you got it right: that's the same display used on the old Nokias! As you can see, I didn't have a fancy colour LCD, but for this demo that one is good enough!

It is not hard to see that most of OpenGL's features are missing: there's no backface culling, no frustum culling, no colours, no depth/z-buffer, no textures, no lighting, no shading, no shaders, and so on.

Although limited, you can use this Arduino library to port some really basic 2D and 3D OpenGL programs to Arduino. You could even try to build a small game! But don't get too excited: you only have 32 KB of program memory and 2 KB of RAM (much of it already used by the library itself). You'll probably only be able to fit one low-poly 3D model and nothing else.
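
To give an idea of what such a port involves, the heart of any minimal 3D renderer is the perspective projection applied to each vertex. Here is a Python sketch of that step (my own illustration, not the library's actual code; the 84×48 resolution matches the Nokia-style PCD8544 display, and the focal length is an arbitrary choice):

```python
# Toy perspective projection: map camera-space 3D points (camera at the
# origin, looking down +z) to pixel coordinates, the per-vertex operation
# a minimal software renderer performs before drawing lines.
# (Illustrative sketch only, not taken from the Arduino library.)

SCREEN_W, SCREEN_H = 84, 48  # PCD8544 ("old Nokia") LCD resolution
FOCAL = 40.0                 # assumed focal length, in pixels

def project(x, y, z):
    """Project a point with z > 0 onto the screen plane."""
    sx = SCREEN_W / 2 + FOCAL * x / z
    sy = SCREEN_H / 2 - FOCAL * y / z  # screen y grows downwards
    return int(round(sx)), int(round(sy))

# The eight vertices of a unit cube centred 4 units in front of the camera
cube = [(x, y, z + 4) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]
pixels = [project(*v) for v in cube]
```

Drawing a wireframe model is then just connecting the projected vertex pairs with the display's line-drawing routine; on an 8-bit AVR the main fight is doing this arithmetic without floating point and without running out of RAM.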

Still want to try it? GitHub