Cova.os - design for trust, support and security

An integrated operating service that leverages intelligent, humanized AI technologies. The system enables preventive cybersecurity, allows intuitive crime reporting online, and identifies digital hotspots. The speculative human-AI hybrid ‘Ebba’ guides and supports the online community against cyber offenses before, during, and after an incident.

Project background

This project is a partnership between Umeå Institute of Design and the Swedish and Norwegian police forces. Our goal is to explore the Swedish Police Authority's online presence and encourage people to proactively report crimes online, particularly those targeting children and young adults. With the growing use of social media, it is vital to establish robust early crime-prevention measures on the internet to protect youth online.

My role

I designed and finalized the design language, user interface, and overall look and feel. I also conducted design research and led user interviews. I facilitated our co-creation workshops and gathered insights from our participants.

Project details

Project team - Anuja Tripathi and Lin Shen

Collaborators - Swedish and Norwegian police

Project duration - 10 weeks

Context

We have police in the physical world: when a crime happens, we can call them. This gets tricky in the digital world. What counts as a digital danger? In the physical world, police know where to patrol because when people report crimes in one area, the police start paying attention to that neighborhood and send more patrols there. If someone is in danger in Sweden, you call 112 and the police send reinforcements. But how might we enable the police to patrol online if they don't know where to go? How do we find the sketchy neighborhoods of the internet? Since this is quite a new arena, people can be exposed to online crime suddenly and without preparation, and in this project we especially want to protect youth. The way to help the police is to encourage people to report crimes online, so the police learn where crimes usually happen and can send more patrols to those online spots.

Why does online safety for youth matter?

Online safety for children is a serious problem, and the police do not know how to address it. Today the police play a reactive role. How might we enable them to take an active, preventative approach instead of waiting for crimes to happen and acting only after the damage is done and it is too late?

Current challenges

1 - Tedious online reporting process to the police

2 - Police do not know where online crimes usually happen

3 - Reporting language sounds cold and intimidating

4 - Lack of sophisticated tools to identify harmful content online for youth


How might we create a safer digital environment for youth, develop humanized online reporting tools for harmful content, and provide the Swedish police with the data needed to identify digital crime hotspots effectively and take a more proactive role to prevent crime online?

Overview of our solution

An integrated operating service that leverages intelligent, humanized AI technologies, including ChatGPT and DALL·E. The system enables preventive cybersecurity and intuitive online crime reporting, and thereby identifies digital hotspots. The speculative human-AI hybrid ‘Ebba’ guides and supports the online community against cyber offenses before, during, and after an incident.

For teenagers under the age of 18, Cova.os applies stricter security filters and settings.

For teenagers aged 18 and over, Cova.os applies more relaxed security filters and settings. The sketch below illustrates one way such tiers could be configured.
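This is a speculative sketch only: the concept does not specify how the tiers are implemented, so the profile fields, threshold values, and function names below are illustrative assumptions about how age-based filter settings could be expressed.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FilterProfile:
    """One age tier's security settings (all field names are assumptions)."""
    blur_flagged_images: bool    # auto-blur content a classifier flags
    screen_stranger_dms: bool    # screen direct messages from unknown accounts
    harm_threshold: float        # lower = stricter; scores above this are hidden

# Hypothetical tiers mirroring the concept: stricter under 18, relaxed at 18+.
UNDER_18 = FilterProfile(blur_flagged_images=True, screen_stranger_dms=True,
                         harm_threshold=0.4)
ADULT = FilterProfile(blur_flagged_images=False, screen_stranger_dms=False,
                      harm_threshold=0.8)

def profile_for(age: int) -> FilterProfile:
    """Pick the filter profile for a user's verified age."""
    return UNDER_18 if age < 18 else ADULT
```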

Five functions

Identifies harmful content

To encourage teenagers to speak up against harmful content shared with them, content that causes stress or negative emotions, such as cyberbullying, we use UX writing to offer emotional support in empathetic language and help them stay calm. Ebba gently nudges youth to reach out to family and friends instead of staying in isolation; family and friends can help teenagers make better decisions and offer further emotional support and strategies for the next step.

Cova.os, our operating system, detects a harmful image and blurs it. Ebba, our humanized AI, then sends an alert message to check whether the image caused any negative emotions for the person in front of the phone.
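To make the flow concrete, here is a minimal sketch of that detect, blur, and check-in sequence. The `classify_harm` and `blur_image` functions are hypothetical stand-ins for a real moderation model and image filter; in the concept they would be backed by the AI services mentioned above.

```python
def classify_harm(image: bytes) -> float:
    """Placeholder for a moderation model returning a harm score in [0, 1]."""
    return 0.0  # a real system would call a vision/moderation API here

def blur_image(image: bytes) -> bytes:
    """Placeholder for an image blur filter."""
    return image

def handle_incoming_image(image: bytes, harm_threshold: float) -> tuple[bytes, str]:
    """Return what to display, and what Ebba should say, for one incoming image."""
    if classify_harm(image) >= harm_threshold:
        # Never show the raw image first; pair the blur with a gentle check-in.
        message = ("I blurred an image that might be upsetting. Are you okay? "
                   "You can talk to me, or reach out to someone you trust.")
        return blur_image(image), message
    return image, ""  # nothing flagged, no check-in needed
```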

Compassion - humanized AI language

Intuitive reporting process

An intuitive flow guides you through each step of the reporting process.

There are two types of reporting features. A platform report lets you report images or incidents directly to the police when you are sure you want to report them. Sometimes, though, people hesitate to report, and that is where the provisional report comes in: Ebba helps you find a way forward, for example by saving the report as a draft so you can talk with friends first and report later, because we know people are sometimes just not ready. A small state sketch below shows how these two paths could relate.
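One way to picture the two paths is as states of a single report: a provisional report is a draft that can be submitted later or simply left alone. The state names and transitions below are my own illustrative assumptions, not part of the delivered concept.

```python
from enum import Enum, auto

class ReportState(Enum):
    DRAFT = auto()       # provisional report: saved with Ebba, not yet sent
    SUBMITTED = auto()   # platform report: sent directly to the police
    IN_REVIEW = auto()   # the police are handling it
    RESOLVED = auto()

# A draft can stay a draft forever, or be submitted when the person is ready.
ALLOWED = {
    ReportState.DRAFT: {ReportState.SUBMITTED},
    ReportState.SUBMITTED: {ReportState.IN_REVIEW},
    ReportState.IN_REVIEW: {ReportState.RESOLVED},
    ReportState.RESOLVED: set(),
}

def advance(current: ReportState, target: ReportState) -> ReportState:
    """Move a report forward, rejecting transitions the flow does not allow."""
    if target not in ALLOWED[current]:
        raise ValueError(f"cannot move a report from {current.name} to {target.name}")
    return target
```

Keeping the draft as a first-class state is what lets Ebba say "you can always report it later" without losing the person's account of what happened.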

Digital hotspots gathering

Cova.os thanks teenagers who take the simple step of reporting; these reports help Cova gather digital-hotspot data for the police. There is also a clear status view for each reported incident, keeping the people who report in the loop.

Summary of police report

A summary report is generated for the police every six months, so they can keep track of how safe the community feels online; it also helps the police identify hotspots and take a preventative approach.
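As an illustrative sketch of how individual reports could roll up into such a bi-annual summary (the `venue` and `reported_on` field names are assumptions; the concept leaves the data model open):

```python
from collections import Counter
from datetime import date

def hotspot_summary(reports: list[dict], start: date, end: date) -> Counter:
    """Count reports per online venue inside one six-month reporting window."""
    return Counter(r["venue"] for r in reports
                   if start <= r["reported_on"] <= end)

# Example: three reports in the first half of the year.
reports = [
    {"venue": "chat-app-A", "reported_on": date(2023, 2, 3)},
    {"venue": "game-lobby-B", "reported_on": date(2023, 4, 19)},
    {"venue": "chat-app-A", "reported_on": date(2023, 5, 7)},
]
print(hotspot_summary(reports, date(2023, 1, 1), date(2023, 6, 30)).most_common())
# [('chat-app-A', 2), ('game-lobby-B', 1)]
```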

Our process informed our design opportunities

1 - Tedious online reporting process to the police: ‘Ebba’, our AI moderator, facilitates the reporting process.

2 - Police do not know where online crimes usually happen: Cova.os creates a bi-annual police report, an evidence pool for the police.

3 - Reporting language sounds cold and intimidating: we utilize UX writing to humanize the AI with more empathy and compassion.

4 - Lack of sophisticated tools to identify harmful content online for youth: Cova.os is integrated across all platforms.

First round of interviews to have context

The first round of interviews was our primary research, aimed at understanding the context and the pressing problems we would be solving. We started by talking with two people in the police force.

We need easy access to reporting online crime to the police, someone who guides and gives the right advice.
— Adam
Hard to identify digital hotspots in the online world.
— Adam
It’s not monitoring my kids, it’s making them aware of them.
— John

Co-creation workshop 1

We found a huge gap in online crime prevention for youth: reporting. We want reporting to fill that gap. How? Once you report a crime, the police know where to patrol online, just as in the physical world.

Co-creation workshop 2

For presence, I want 24/7, 365. It’s a resource question.
— John

Our goal in this workshop was to visualize safety, to imagine what it looks and feels like for people, and to explore what stops people from filing police reports online.

I am scared of what’s gonna happen and don’t know what to do. What if this goes too far?
— Jenna
I had a bad experience phoning the police; they follow the rules so strictly and forget that there are people involved in these things.
— Mother Corne

Concept validation

“I will definitely use it.”

Local newspaper recognition

Reflection

Figure 1. How we worked as a team of three

Figure 2. Design togetherness in this project

Design togetherness is at the core of what I learnt in this project. Figure 1 shows the structure we used, which helped us work as a team with equal responsibility and fairness. After the workshops and research, we did not divide the work by who was good at what; instead, we designed the final interface together. For example, we started wireframing together, which grew to 15 screens. We then each designed the first five screens individually, came back together to give each other feedback, and chose a design, often combining our versions into one that best fit the design direction and design language. One of my goals was to learn workshop facilitation and UX/UI, and I got to do both. This structure really worked for me and for our team: we came together and learned from each other. The feedback sessions were especially valuable; we could see how our research and workshops came to fruition for each of us, and how we think and visualize things differently.

What we did before was write random notes on random papers, but when Monica introduced us to a giant whiteboard where we could all put our notes down and contribute, it proved valuable for all of us. I will take this reflection habit into my future work as a designer. Monica came in every two weeks to chat with us about teamwork at a high level, which was helpful for us as a team; it was like a health check, and I will carry that biweekly reflection into my future career. We embedded systems thinking and co-design into our final design, especially in the design ethnography phase. We thought a lot about different stakeholders and how people beyond those we are designing for matter and can add value to the process. We could have done a better job at participatory design: we were a bit late in involving the teenagers at the beginning, but we did include them in the end.

During our design ethnography, it was hard to do truly participatory design in our workshops; the first one was more like facilitating people creating individually in a shared environment. Next time, it would be nice to think about how everyone can participate and design together, building on each other's ideas instead of working individually. It is also hard to run a workshop without knowing exactly what you are designing, but I came to see that we were designing for something within a larger topic, and in the end the pieces can tie together, if not all of them. Improvising during a workshop is also really important. For example, we were interviewing a parent and a child, but the parent was often speaking on behalf of the child, so I discreetly passed a note to my team, careful not to cause any discomfort. In the end we were able to chat with each of them separately: I took the child and my teammates took the mother. It went well.

Another challenge was going back over the insights in the interview videos and listening to them again; it is a time-consuming process, but I think it is necessary, and we have not yet figured out how to do it time-efficiently. We also found that when we apply systems thinking and design for the people using the system, teenagers in our case, it is a good tool for them, but it is hard to design one tool for all, and I am not sure that is possible. I have a constant debate between designing for a target group and designing for everyone. I was looking at the Stokke chair from Norway, designed to adapt as children grow up, so it covers more target groups, but still: what about disabled people?

One further point on teamwork: when everyone understands what is going on, you can create together and build on each other's ideas, and I think that is something magical. I am glad I experienced that in this project. Overall, this project has been a good experience. I learnt a lot, and it made me more interested in ethics and critical AI, and in how ethics plays a role in technology. Now I want to explore this topic further. (I will also have more reflections when I polish my portfolio for this project later. Unfortunately, I did not keep a journal this time.)
