Automatic code generation from sketches of mobile applications in end-user development using Deep Learning

A common need in end-user mobile application development and in computing education is to transform a sketch of a user interface into wireframe code for App Inventor, a popular block-based programming environment. As this task is challenging and time-consuming, we present the Sketch2aia approach, which automates this process. Sketch2aia employs deep learning to detect the most frequent user interface components and their positions in a hand-drawn sketch, creates an intermediate representation of the user interface, and then automatically generates the App Inventor code of the wireframe. The approach achieves an average user interface component classification accuracy of 87.72%, and the results of a preliminary user evaluation indicate that it generates wireframes that closely mirror the sketches in terms of visual similarity. The approach has been implemented as a web tool and can be used to support the end-user development of mobile applications effectively and efficiently, as well as the teaching of user interface design in K-12 education.

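The abstract describes a three-step pipeline: detect components and their positions in the sketch, build an intermediate representation, and generate the wireframe code. The minimal Python sketch below illustrates that flow under stated assumptions; the detector stub, the `Component` class, the JSON-like intermediate format, and the textual output are all hypothetical stand-ins for illustration, not the paper's actual model or App Inventor project format.

```python
# Illustrative sketch of a sketch-to-wireframe pipeline as described above.
# All names and formats here are assumptions for illustration only.
from dataclasses import dataclass
from typing import List
import json


@dataclass
class Component:
    kind: str   # e.g. "Button", "Label", "TextBox"
    x: float    # normalized left coordinate in the sketch
    y: float    # normalized top coordinate
    w: float    # normalized width
    h: float    # normalized height


def detect_components(sketch_path: str) -> List[Component]:
    """Stand-in for the deep-learning detector: the real approach would run
    an object-detection model over the hand-drawn sketch image."""
    # Hard-coded example detections for demonstration only.
    return [
        Component("Label", 0.10, 0.05, 0.80, 0.08),
        Component("TextBox", 0.10, 0.20, 0.80, 0.10),
        Component("Button", 0.30, 0.40, 0.40, 0.10),
    ]


def to_intermediate_representation(components: List[Component]) -> dict:
    """Order detections top-to-bottom, left-to-right and serialize them into
    a simple structure that a code generator can consume."""
    ordered = sorted(components, key=lambda c: (c.y, c.x))
    return {"screen": "Screen1",
            "components": [c.__dict__ for c in ordered]}


def generate_wireframe_code(ir: dict) -> str:
    """Emit a textual wireframe spec; a real generator would instead write
    the component section of an App Inventor (.aia) project."""
    lines = [f'Screen "{ir["screen"]}"']
    for i, c in enumerate(ir["components"], start=1):
        lines.append(f'  {c["kind"]}{i} at ({c["x"]:.2f}, {c["y"]:.2f}) '
                     f'size ({c["w"]:.2f} x {c["h"]:.2f})')
    return "\n".join(lines)


if __name__ == "__main__":
    detections = detect_components("sketch.png")
    ir = to_intermediate_representation(detections)
    print(json.dumps(ir, indent=2))   # intermediate representation
    print(generate_wireframe_code(ir))  # generated wireframe spec
```

Separating detection from generation in this way means the intermediate representation can be inspected or edited before any code is produced, which is one plausible reason for introducing such a layer between the sketch and the App Inventor output.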