Sara Birchard
Digital Product / UX Designer

Clairvoyance for Verizon

 

— created for a 2-day hackathon @ Verizon

 

This project was created during a 2-day design sprint at a design jam hosted by Verizon in October 2017. The prompt was: come up with an idea for a type of Artificial Intelligence (AI) that helps users discover new content.

Team [404]

The team consisted of me and three other DT students from Parsons: Nicolás Hernández Trujillo, Archit Kaushik, and Dylan Negri. Our multidisciplinary backgrounds spanned electrical engineering, advertising, software development, and psychology. We played to each of our strengths to make the final product the best it could be in such a short span of time.


Problem & Solution

Problem: 

  • Spending too much time browsing through search results

  • Spending too much time finding the most relevant content

Solution:

  • A better indexing tool that constantly learns about your search preferences

  • An efficient way of presenting personalized search results

My Role

We began by whiteboarding and developing our concept together. Once we all agreed on a concept, we split up to work on separate components individually, consulting one another on important decisions along the way. My primary role was to design the UI of the application. I also played the voice of Claire and helped create the final presentation.


Features

  • Content is presented as a continuous sequence of 10-second video clips that highlight the key points of each video. Claire will have the ability to recognize sound waves, respond to micro-expressions, and continuously deliver content based on what it has learned from your preferences and behaviors.

  • Claire will have the capacity to sift through the video database by its metadata (e.g. meta tags, hashtags, keywords, popularity, number of views, likes/dislikes); a rough sketch of this kind of tag-based ranking follows the list.

  • Claire can collect keywords from a video's sound waves.

  • Users can add highlights to video content and create timestamps with “likes.”

  • Claire will recognize and compare sounds.

  • Claire can look at every frame of a video and understand its content.
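
To make the ranking idea above a little more concrete, here is a minimal, hypothetical sketch of how clips carrying metadata tags could be re-ranked as a user "likes" content. None of this is the team's actual code; the clip data, tag names, and functions are purely illustrative.

```python
# Hypothetical sketch: clips carry metadata tags, the user's preference
# weights grow with each "like", and clips are re-ranked by how well
# their tags match those learned weights.
from collections import defaultdict

clips = [
    {"id": "clip-1", "tags": ["basketball", "highlights"]},
    {"id": "clip-2", "tags": ["cooking", "pasta"]},
    {"id": "clip-3", "tags": ["basketball", "interview"]},
]

preferences = defaultdict(float)  # tag -> learned weight

def record_like(clip):
    """Bump the weight of every tag on a clip the user liked."""
    for tag in clip["tags"]:
        preferences[tag] += 1.0

def rank(clips):
    """Order clips by how strongly their tags match the learned preferences."""
    return sorted(
        clips,
        key=lambda c: sum(preferences[t] for t in c["tags"]),
        reverse=True,
    )

record_like(clips[0])       # the user likes a basketball clip
for clip in rank(clips):    # basketball clips now surface first
    print(clip["id"])
```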

 

Final Presentation