Hi, I'm Onofrio, or simply call me Ono! I'm an Interaction Designer currently living in Markgröningen, close to Stuttgart. Since 2020 I have been working as a UX Lead at Robert Bosch Power Tools, where I develop new digital businesses for Measuring Tools in a cross-functional, agile team. My experience as a coach and UX project lead during my first years at Bosch helped me expand my methodological horizon and taught me how to work with people from different disciplines and with different mindsets. Here on my website you can see a small selection of the projects I worked on and managed. If this makes you curious and you would like to get to know me, let's get in touch.
Bosch Sensor Technologies developed a technology that makes it possible to build extremely small laser projectors, which they have also built into home appliance prototypes like the kitchen projector they presented at CES in 2019. The next idea was to make a projector that would fit perfectly into a glasses frame. Bosch Sensor Technologies approached us with their technology and asked us to find and evaluate meaningful use cases for it. Starting from a technology rather than a user need was not the usual way we handle projects, but the technology sounded promising, so we took on the challenge.
We were a team of five professionals: two designers, one developer and two researchers. This sounds like the perfect recipe for a proper design thinking meal.
After a first round of open qualitative interviews we started brainstorming to come up with several use cases that would make sense for our user group. We also thought about how we could further improve the glasses with sensors to create even more meaningful experiences.
One of the biggest challenges in the interface design was that we couldn't use any black in our designs, since a laser projector can only add light: black is simply the absence of projection. Really dark colors were also hard to read. We spent a lot of time with a black box testing different colors and fonts, deciding which ones were legible and which ones we had to exclude from our designs. We also avoided filled icons and graphics, since they felt disturbing to us when projected directly into the eye.
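In hindsight, a lot of that black-box testing comes down to a luminance check: a color too close to black simply cannot be projected. A minimal sketch of such a pre-filter, using the WCAG relative luminance formula and a made-up cutoff (we judged legibility by eye, not by number):

```ts
// Pre-filter candidate UI colors for a laser projection display:
// anything too close to black cannot be rendered, because the
// projector can only add light.

// Relative luminance per the WCAG definition (sRGB input).
function relativeLuminance(hex: string): number {
  const channel = (c: number): number => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  const r = parseInt(hex.slice(1, 3), 16);
  const g = parseInt(hex.slice(3, 5), 16);
  const b = parseInt(hex.slice(5, 7), 16);
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// Illustrative cutoff, not the value we actually worked with.
const MIN_LUMINANCE = 0.1;

const palette = ['#000000', '#1a1a2e', '#ff6600', '#00c8ff', '#ffffff'];
const projectable = palette.filter(c => relativeLuminance(c) >= MIN_LUMINANCE);
console.log(projectable); // the dark candidates are filtered out
```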
June 2019
In 2013 I started working for Audi as a consultant at Spiegel Institut GmbH. After my internship at Audi in 2012 I already knew the whole Audi team and how they worked. My first project was to be the Audi tablet (the internal project name was Smart Display). I got the opportunity to lead a small UX team and lay the foundation for Audi's new rear seat entertainment generation.
The challenge was to create a GUI concept for a rear seat entertainment system that is based on Audi's overall MMI GUI concept but works on a touch screen. Since Audi had never done touch interfaces before, the Audi tablet was the first of its kind. The Audi tablet was to be introduced with the Audi Q7, which itself introduced a whole new MMI system in the car's cockpit, internally called MLBevo. Though that was still a remote-controlled system, it added an all-in-touch display for handwriting and touch gestures; the interface itself, however, was not designed to be touch compatible.
By the end of 2012 Nokia had closed down their R&D center in Ulm, and Audi saw an opportunity to gain new know-how in the mobile device world, so they set up a new software and hardware development center in Ulm. Most of the colleagues I was lucky to work with were ex-Nokians with a lot of experience in agile software development. We were a team of designers, software developers and engineers, ready to create an Audi tablet that fits into Audi's ecosystem.
We worked in a Scrum-like process. Since the colleagues in Ulm were not on site, we had longer sprints (one sprint was four weeks) and did our daily scrums via conference call. For our sprint meetings we met either in Ulm or in Ingolstadt. The distance sometimes made it really hard to be as quick and flexible as you need to be in Scrum. The stakeholder meetings in particular were really hard to set up, because Audi had never worked with a Scrum process before and was not aware of how important the regular meetings and the full presence of every stakeholder are. Even though we had some trouble at the beginning, I managed to bundle information, communicate it, and keep everything running from an interface point of view.
The credo for the Audi tablet was: create an Audi touch interface on an Android OS that doesn't look like Android.
This task was not easy, since Android has pretty strict guidelines regarding its OS. We discussed every eventuality and compromised on a lot of topics, but it still had to look like a pure Audi tablet. The home screen, for example, has always been one of the key screens in Audi's MMI. The ring-shaped menu is a metaphor for the rotary knob and gives you a hint which way to turn it to reach the next entry. We knew that we couldn't change it completely, but we also knew that this screen was optimized for a remote-controlled system. So we decided to keep the ring but to arrange it horizontally, with the main menu entries sitting on it side by side.
The main applications, like the media section, also shouldn't differ too much from the front system's interface. We had to integrate the playback controls into the media menu, because we had no remote control or hard keys to skip, pause or scrub the playtime.
On the left side we added a source navigation menu so the user could quickly switch the media source. As we were designing for the rear seat, the entertainment menu was one of the hubs of the Audi tablet.
After taking this project to serial production, Audi hired me as a UX designer and I became part of the task force whose mission was to define Audi's new infotainment for the A8's front system. We pitched three different concepts. After the decision on the basic concept, we designed the guidelines and principles we would need in the later development. After the task force I was appointed GUI specialist for the new system. Currently I'm working on ideas for integrating system intelligence into the car and how it should be presented to the user.
June 2015
Nowadays there's more than one way to get from one point to another. Even though the oil price is rising, more and more people still prefer to take the car instead of using public transportation. We didn't use public transportation much either, so we ran an experiment in the field: we dropped ourselves somewhere in Stuttgart and tried to find our way to the stadium without using any smart devices. We made it to the stadium only after a lot of trouble, and that trouble was the main reason why we wanted to optimize the information structure at bus stations.
Our goal was to organize and structure the information and to create an interactive experience for information hotspots. The interface should be visually appealing and intuitive. We wanted to help users find their destination as easily and quickly as possible, and thereby take away any inhibitions about using public transport.
After our goal was defined, we had to find out who our users are and what their problems with the existing systems are. We started with a shadowing session at Stuttgart main station. We realized that most people didn't even read the plans, because they already knew their connection. But there was a group of Asian tourists having a hard time understanding what the plan was about: they struggled with the poorly translated information and the static pictures. We also saw some elderly people having trouble with the information. After these observations we decided to create a questionnaire and do some interviews with real users.
"My daughter looks it up for me and prints it out."
Sophie (housewife)
"I don't use time schedules, the maps sometimes. Something more interactive would be helpful, depending on the user interface."
Darren (international student)
There are already plenty of apps that try to solve the problem of disorientation in public transportation. So we had a look at them and tested a handful (the DB app, Moovel, Transit and the VVS app). After that, we realized that we didn't want to create yet another app, so we decided to focus on user groups that either don't have a smartphone or can't use it at the moment (empty battery, no wifi connection). We decided to build a coexisting system that works together with the existing guidance system. We also took a look at the interface of the ticket machine and found out that it was already three years old and had no updating mechanism. After a deep analysis we wrote down the pain points that we wanted to address with our interface.
After the card sorting we defined personas so we would never lose focus on the actual user. The Tourist is a Korean backpacker who doesn't have a smartphone. The second one is the Student, who mostly uses his smartphone but always forgets to recharge it. We had two more personas, but we decided to focus on these two because we could empathize best with them.
Now we had the base to start on our concepts. We did a technology research and decided to use a touch screen for the ticket machine. But we had trouble finding a technology that could replace the paper ticket: the requirements were a display that is flexible, dynamic and doesn't necessarily need power. During a discussion, a good friend of mine drew my attention to his new Kindle Paperwhite. I asked him how that thing worked, and he explained to me that it uses an e-paper display. After looking a little deeper into the technology, my project partner and I realized that this was exactly the technology we were looking for.
Our interface is based on a detailed information architecture that we roughly clustered into three content areas: an input area, an information layer and a layer for the visual content.
At the end of our project we wanted to experience our interface ourselves, so we built a wooden box as a container for a 24-inch touch screen to get the proper ergonomics. With a 3.5-inch e-paper display we simulated the ticket. My main task was to prototype the interface in HTML. It had a working map that showed live information about the public transport connections. I used a Google Maps integration, and by manipulating the style JSON we managed to create our own map style.
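For anyone curious how such a custom map style works: the Maps JavaScript API accepts an array of style rules that target map features and elements. The rules, colors and element choices below are illustrative, not our actual style definition:

```ts
// Minimal sketch of a custom-styled Google Map (Maps JavaScript API).
// Each rule targets a feature/element pair and applies "stylers";
// the values here are placeholders, not the prototype's real style.
const customStyle: google.maps.MapTypeStyle[] = [
  { featureType: 'transit.station', elementType: 'labels',
    stylers: [{ visibility: 'on' }] },
  { featureType: 'road', elementType: 'geometry',
    stylers: [{ color: '#e0e0e0' }] },
  { featureType: 'poi', stylers: [{ visibility: 'off' }] },
];

// Assumes the Maps JS API script is already loaded on the page.
const map = new google.maps.Map(
  document.getElementById('map') as HTMLElement,
  {
    center: { lat: 48.7838, lng: 9.1816 }, // Stuttgart main station (approx.)
    zoom: 16,
    styles: customStyle,
  },
);
```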
David Nickel
February 2013
Our approach to this project was different than usual. Normally, as designers, we identify problems and then try to find a way to solve them. My teammates and I started this project the other way round: we first did a deep and tedious technology research and then used it to solve a problem we didn't even know at the beginning. After the research we decided to focus on EMG muscle sensors. We thought it could be an interesting way to track gestures without the need for cameras or similar devices.
Download the paper about EMG sensors
According to the WHO website, over 5% of the world's population – 360 million people – have disabling hearing loss. We thought that these people should have the chance to express themselves without restrictions. At that time medical institutions were already working on better hearing aids, and there were already schools that helped deaf people learn to speak. But why take away their beautiful way of communicating?
I once saw a documentary about deaf people. A deaf person was interviewed by a journalist in a deaf café, and even though nobody was speaking out loud, the room was full of life and dynamic discussions. The people in there were using sign language as if it were the easiest thing in the world. So why should the rest of us force these people to give up their way of communicating?
We came up with a concept that enables deaf people to talk with hearing people while still allowing them to use sign language.
We focused on four different contexts in which our product could be used. The first two were speak and reply, the base of every conversation. We tried to figure out how the system could voice a question and how the deaf person could receive an answer. The first part would follow a gesture-to-audio approach, and the second part would be realized with speech-to-text technology.
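Today, both directions can be sketched in a browser with the Web Speech API. This is a minimal illustration of the idea, not the technology we actually used back then:

```ts
// Gesture to audio: once a sign is recognized, speak it out loud.
function speakSign(text: string): void {
  const utterance = new SpeechSynthesisUtterance(text);
  utterance.lang = 'en-US';
  window.speechSynthesis.speak(utterance);
}

// Speech to text: transcribe the hearing person's reply.
// Recognition is vendor-prefixed in Chromium-based browsers.
const Recognition =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;
const recognition = new Recognition();
recognition.lang = 'en-US';
recognition.onresult = (event: any) => {
  const transcript = event.results[0][0].transcript;
  console.log('Reply:', transcript); // would be shown on the deaf person's display
};

speakSign('How can I help you?');
recognition.start();
```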
We made a short concept video to give an idea of how the system could work in everyday life. The video shows a hearing person asking a deaf person if she may ask her something. At this moment the deaf person puts on her Gestics device and replies, "How can I help you?" The hearing person wants to know the way to the train station, and the deaf person answers that she can show her the way. The concept video shows a hologram interface, because we imagined future phones projecting the dialog onto a holographic display.
After depicting the future we wanted to prove that we could already take a small step towards translating sign language. We therefore oriented ourselves on the messaging context, since for messaging we wouldn't necessarily need to detect whole words; recognizing single letters would be enough.
Gestics Audioglove from Onofrio Di Franco on Vimeo.
Even though the glove shows just three different signs (paper, scissors and stone), we could have covered every combination of the sign language alphabet. The glove was realized with an Arduino LilyPad and flex sensors.
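The recognition itself comes down to simple thresholding: each flex sensor reports how far a finger is bent, and the combination of bent and straight fingers identifies the sign. A sketch of that logic, written in TypeScript for readability (the original ran as an Arduino sketch, and the threshold value is illustrative):

```ts
// Classify paper/scissors/stone from flex sensor readings.
// Readings are assumed normalized to 0 (straight) .. 1 (fully bent);
// on the LilyPad they came from analogRead() on the sensor pins.
type Fingers = { index: number; middle: number; ring: number };

const BENT = 0.6; // illustrative threshold, tuned per sensor in practice

function classifySign({ index, middle, ring }: Fingers): string {
  const bent = [index, middle, ring].map(v => v > BENT);
  if (bent.every(Boolean)) return 'stone';   // fist: all fingers bent
  if (!bent.some(Boolean)) return 'paper';   // flat hand: no finger bent
  if (!bent[0] && !bent[1] && bent[2]) return 'scissors'; // index + middle out
  return 'unknown';
}

console.log(classifySign({ index: 0.9, middle: 0.8, ring: 0.9 })); // stone
console.log(classifySign({ index: 0.1, middle: 0.2, ring: 0.1 })); // paper
console.log(classifySign({ index: 0.2, middle: 0.1, ring: 0.8 })); // scissors
```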
Luise Peschek, Marta Miosga, Sohyun Kim
February 2010
Arduino, Processing, After Effects