Memwris Mobile UI


UI and Motion Designer



Memwris approached me to create UI animation prototypes for their upcoming mobile programming app. Memwris promises a mobile UI for intuitive, efficient programming using only touch gestures. The animations in the app UI are programmatically driven by the user's gestures. As such, bespoke prototype animations are core to communicating the fluid experience of writing code with Memwris.

Memwris features the promo video and animations I created as the centerpiece of their website to communicate the value proposition of their app. The video is their first point of contact when presenting the app to investors and customers.


The client provided a voiceover (VO) script and wireframes that described animation keyframes using text and geometric equations, from which I synthesized detailed storyboard sketches. These storyboards allowed me to verify the careful timing of gestures and the movement of UI elements before moving the project into After Effects.

Storyboard sketches to communicate timing of gestures and animations.

Storyboard detail.

The dynamic nature of the Memwris UI prevented me from relying on the standard toolkit of UI animations found in design prototyping apps. While recreating the look and feel of a desktop programming environment for a mobile form factor in After Effects, I kept a close feedback loop with the client to make sure I stayed on the right track.

For example, after I completed the initial UI animations, the client's software engineer sent whiteboard drawings paired with lists of video timecodes so I could animate the precise finger gestures a user would make to produce the behavior I had illustrated in the app UI.

Two-finger text selection animation.

Fluid touchscreen typing with one or two fingers.

Creating some of the fluid animations in After Effects required creative collaboration with the client. Rather than spend hours replicating the complex gestural keyboard UI in After Effects, I motion-tracked text onto video of the keyboard shape animations from the alpha build of the app. This process meant finding the line between animating efficiently and preserving what the eye reads as a cohesive interface.