Netflix knows what movies you like, Spotify knows your taste in music, and Tinder knows your ideal partner down to employment status and eye color. All of these apps base their recommendations on algorithms that track your desires.

But what if a recommendation algorithm could make suggestions based not just on what you want, but on what you need?

As our dependence on technology grows, algorithms have become an integral part of our decision-making. Many people are unaware of how often algorithms shape their lives: online dating services like Tinder use them explicitly to match users, while social media sites like Facebook use them implicitly to build the news feed.

Algorithms track users’ recent activity in an effort to show them what they want to see. This process produces what is called the ‘echo chamber,’ a system in which media users are exposed only to content reflecting their own views, fueling social and political polarization.
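
To make the mechanism concrete, here is a minimal sketch (in Python, not drawn from any platform’s actual code) of engagement-driven ranking: candidates that overlap most with a user’s recent activity always rise to the top, so the feed converges on what the user already consumes. The items and topics are invented for illustration.

```python
from collections import Counter

def rank_by_recent_activity(candidates, recent_items):
    """Score each candidate by topic overlap with recently consumed items."""
    recent_topics = Counter(t for item in recent_items for t in item["topics"])
    return sorted(candidates,
                  key=lambda item: sum(recent_topics[t] for t in item["topics"]),
                  reverse=True)

recent = [{"topics": ["politics", "opinion"]}, {"topics": ["politics"]}]
catalog = [
    {"title": "More of the same", "topics": ["politics", "opinion"]},
    {"title": "Something different", "topics": ["science"]},
]
print([item["title"] for item in rank_by_recent_activity(catalog, recent)])
# -> ['More of the same', 'Something different']
```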

Presidential Fellows in Data Science Mark Rucker and Derek Lam, sponsored by the Data Science Institute and the Office of Postdoctoral and Graduate Affairs, are designing an algorithm that acts paternalistically, helping people make decisions based not just on their immediate desires but also on what they value as good.

Rucker and Lam use statistical analysis to distinguish between what users want and what they value, and are designing a book recommendation app that presents choices going beyond users’ immediate desires. The researchers hope that tailoring book recommendations to user values will help mitigate the echo chamber effect.
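
The researchers’ statistical approach is not detailed here, but the basic idea can be illustrated with a hypothetical scoring scheme: blend a “want” score estimated from past reading behavior with a “value” score reflecting what a reader says they value, and let a weight control how far recommendations nudge beyond immediate desires. Every name, score, and weight below is invented for illustration.

```python
def blended_score(want_score, value_score, paternalism_weight):
    """Weight 0 reproduces pure desire-matching; higher weights give the
    reader's stated values more influence over the ranking."""
    return (1 - paternalism_weight) * want_score + paternalism_weight * value_score

books = [
    {"title": "Comfort read",     "want": 0.9, "value": 0.2},
    {"title": "Challenging read", "want": 0.5, "value": 0.9},
]

for w in (0.1, 0.6):
    ranked = sorted(books,
                    key=lambda b: blended_score(b["want"], b["value"], w),
                    reverse=True)
    print(w, [b["title"] for b in ranked])
# 0.1 -> ['Comfort read', 'Challenging read']
# 0.6 -> ['Challenging read', 'Comfort read']
```

At a low weight the ranking simply mirrors revealed preferences; raising the weight lets value-aligned books surface, which is the kind of gentle nudge the researchers describe.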

Paternalistic algorithms present ethical challenges because they nudge users toward decisions that deviate from their immediate desires, reducing their perceived autonomy.

Rucker and Lam are reviewing philosophical work on paternalism to ensure their approach is ethically defensible; investigating the ethics behind their methods is part of the research itself.

The researchers are building the book recommendation app on Goodreads’ public book reviews to demonstrate a practical use of paternalistic algorithms and show how the echo chamber can be overcome. This research will provide substantial insight into the ethics of applying mild paternalism and could pave the way for future work at the intersection of philosophy and machine learning ethics.

Researchers:
Mark Rucker
Ph.D. Candidate
School of Engineering
Derek Lam
Ph.D. Candidate
College of Arts and Sciences
Sponsors:
Data Science Institute
Office of Postdoctoral and Graduate Affairs
Completed in:
2018
Tags:
Machine Learning