
You. And The Morals of Technology

by Fernando Mendes

In "You. And The Morals of Technology," presented at the Balkan Ruby 2019 conference, Fernando Mendes explores the moral implications of technology and how developers navigate complex ethical dilemmas. He encourages the audience to think critically about their actions and the broader implications of the technology they create.

Key Points Discussed:
- Introduction to Morality: Mendes begins by discussing the nature of morality, prompting the audience to reflect on various ethical scenarios related to everyday life and technology. He engages the audience with examples that challenge their moral judgments, such as taking free samples, using work supplies, or utilizing a neighbor’s Wi-Fi.
- Technology and Social Media: The talk delves into Facebook's emotional-contagion experiment, showing that social media platforms can deliberately influence user moods and behaviors. Mendes highlights Facebook's moral responsibility to apply AI technology without causing harm.
- Privacy Concerns: Mendes addresses the growing loss of privacy in the digital landscape. He examines how surveillance technology can be misused and what that implies ethically for the developers who build it.
- China's Technological Landscape: The speaker touches on China's advanced technology situation, including the Great Firewall, social credit systems, and facial recognition. These examples exemplify the moral dilemmas faced by developers when building tools that can be used for surveillance and social control.
- Impact on Children and Society: Another focus is the effect of technology on children, particularly how influencers on platforms like Instagram can shape dietary choices and behavior. Mendes warns of the datafication of children and the long-term consequences it might entail.
- 5G Technology and the Creative Economy: He discusses the implications of 5G, highlighting the digital divide, and examines how free online content has reshaped revenue and touring demands for artists.
- Ethical Boundaries in Science and AI: Mendes raises profound questions surrounding advances such as CRISPR and AI, urging developers to consider the societal consequences of engineering 'superior' humans or delegating life-and-death decisions to self-driving cars.
- Call to Action: Mendes concludes by motivating developers to take responsibility for their contributions to technology and society. He shares his initiative, Include Braga, which helps those in need and encourages others to engage in meaningful community work.

Conclusions: The talk emphasizes the importance of recognizing the moral dimensions of technology and the responsibilities of developers in shaping a future that prioritizes ethical considerations. Mendes advocates for collective awareness and action among technologists to ensure that progress does not come at the cost of societal well-being and ethical standards.

00:00:21 Before we begin, can we give a big round of applause to the organizing team? They're doing great work!
00:00:33 It's been an awesome conference, and I ask you to be as loud as you can—let's hear it for that!
00:00:39 Thank you for being so loud! The hungover people at the back just went, "Oh, right." Okay, so before I begin the talk, I am legally obliged to say that these opinions are my own.
00:00:55 They do not reflect the views of my current or past employers, my mom, my dad, or my cat, and so on.
00:01:14 So, let’s start by talking about morality. Let's discuss the difference between what's right and what's wrong and review some cases that some of us have probably encountered in daily life.
00:01:30 I’ll present four examples.
00:01:36 To begin, let's do a show of hands for this situation: If you know that you will definitely not buy a product, but you see free samples available, is it wrong to take them?
00:01:51 Who thinks it's wrong? Raise your hand.
00:01:58 Not many, just one person.
00:02:04 Okay, good. Now, let’s talk about work supplies.
00:02:10 Is it morally wrong to use work supplies for your own benefit? Who thinks it's wrong?
00:02:21 Some people, right? What if you’re printing your own resignation notice?
00:02:28 Are there more people thinking it’s wrong to do that?
00:02:36 Let’s move on to another example: You go to the ATM, and it gives you an extra $100.
00:02:50 You double-check, and there was no extra money taken from your account. You wanted to take $100, but now you have $200. Do you think it's wrong to keep the extra money?
00:03:03 A lot more people think it's okay. We're getting good at this.
00:03:15 Now, let’s consider the neighbor’s Wi-Fi. Imagine you’ve just moved to a new place, and your ISP tells you that they will take about three weeks to provide you with Wi-Fi.
00:03:31 In the meantime, your neighbor has a Wi-Fi connection. Would you use your hacking skills to get his password for those three weeks?
00:03:43 Let’s see a show of hands—who would use it for three weeks?
00:03:55 A lot more people are willing to do so. That's good.
00:04:08 Now, you find out that your neighbor is lonely, an elderly person who spends his days online talking to his grandkids through video calls.
00:04:23 Since you took his Wi-Fi password, his internet connection has been poor, leaving him unable to communicate with his grandchildren and deepening his isolation.
00:04:35 Knowing this, would you still use his Wi-Fi? Show of hands.
00:04:50 Most of you see how a small act, like using someone's Wi-Fi, can have a significant impact.
00:05:04 Now, let’s talk about company policy. Suppose you’re in charge of the networks in your company, and your best friend’s partner also works there.
00:05:19 One day, an outgoing email goes into quarantine, and they message you asking you to let it through. You trust them because they're your best friend's partner.
00:05:35 Later, when you inspect the quarantined email, you find out they are having an affair.
00:05:44 What would you do? Would you confess to your best friend, knowing it may cost you your job since you let that email through?
00:05:57 Or would you keep quiet and let your best friend find out on their own? Who would confess?
00:06:05 A lot of people.
00:06:11 Now, let's assume that your company is being audited. Your supervisor asks you to delete backups that the auditors have requested, knowing that refusing might cost you your job. Would you do it?
00:06:23 Who would refuse? Again, mixed feelings.
00:06:30 Lastly, you're working for a client with a website that recommends a drug based on user responses. The client's requirements state that the website must always recommend their drug, unless the user is allergic.
00:06:43 You find out that one possible side effect of the drug is depression and suicidal thoughts.
00:06:57 Would you follow the instructions and promote the drug? Who would do it?
00:07:06 No one here, but the truth is, this has happened in real life.
00:07:20 Bill Sourour wrote a blog post about it. At the time, he didn't know about the side effects.
00:07:33 Later, he found out that a girl took the drug, went online, and then tragically took her life.
00:07:40 This is the code Bill is most ashamed of.
00:07:51 Let’s lighten the mood a bit.
00:08:01 My name is Fernando. This is me in 1997 planning world domination.
00:08:14 I eventually realized that world domination needs to take baby steps. Two years later, I was focused on fashion domination.
00:08:27 I work for a company called Subvisual; we're in Portugal.
00:08:41 People think that life in Portugal is pretty much like this, and I'm here to tell you that it's exactly like that.
00:08:56 We all have wine by our computers, we have pictures of Cristiano Ronaldo, but one small detail—we also have debts to Germany.
00:09:07 We actually look a bit more like this. And you must be worthy to have the codfish.
00:09:19 Now that we've laughed a bit, let's delve into the serious part again. I’ll go through some cases to blur the line between good and bad, especially related to technology.
00:09:30 Let’s talk about Facebook. Most people are aware of the social divide and issues that arise from social media.
00:09:41 It polarizes reality, tailoring what users see based on their tastes. However, let's focus on another aspect.
00:09:54 It's easy to discuss the experiment where Facebook tried to influence user mood. Here's their study, and a direct quote from the experiment.
00:10:09 When they suppressed positive content in users' feeds, those users went on to post less positive content themselves.
00:10:19 This indicates that posting less positive content correlates with depression.
00:10:28 They continued the experiment, reducing exposure to negative emotional content, and found that people became happier.
00:10:38 This was the first experimental evidence of emotional contagion via social networks.
00:10:46 The conclusion points not only to peer influence but also to how emotions can be influenced through artificial intelligence.
00:11:01 The question now is: how long has Facebook been doing this? In an interview, the engineering manager for newsfeed ranking revealed that they were using AI instead of their older algorithm.
00:11:14 This article is from 2013, and they had the AI internally for two and a half years by then.
00:11:28 So, it means they had been influencing emotions without us realizing it since at least 2010.
00:11:39 Facebook has the moral duty of applying their powers responsibly.
00:11:52 At the very least, we expect them to not bring harm. They have engaged in suicide prevention efforts.
00:12:06 Facebook analyzes posts to flag individuals who might be depressed or in a near-suicidal state, offering them support options.
00:12:17 In very serious cases, if there is an imminent danger of self-harm, Facebook may contact authorities or loved ones.
00:12:30 This idea demonstrates that technology isn't binary—it's not simply good or bad.
00:12:42 There are a lot of bad sides to Facebook, but it’s a multi-dimensional issue.
00:12:53 Nevertheless, the harm it can cause and the good it can bring are evident.
00:13:05 Let's discuss the loss of privacy. It’s easy to highlight the negative impacts of social media. However, I want to peek into the future.
00:13:25 We're living in a time when it's easy to slide into Orwellian societies. I'll touch upon China, but let me clarify: I'm not trying to accuse or discriminate against any nation.
00:13:38 I chose China because its technology usage is quite advanced, making it easier to showcase potential good and bad.
00:13:50 Let’s start with the Great Firewall. We all know about it; numerous websites are blocked in China.
00:14:02 These include Google, YouTube, Vimeo, Netflix, Facebook, Twitter, Instagram, and many more. Recently, Wikipedia has also been banned in all languages.
00:14:20 As a developer creating this Great Firewall, would you feel comfortable blocking access to knowledge?
00:14:30 Now, consider the CCTV systems. A BBC reporter let the authorities take his facial profile and then walked through the streets.
00:14:44 It took just seven minutes for them to locate him through the thousands of cameras.
00:14:58 They possess facial profiles of citizens to prevent crime, but it is easy to misuse this technology for political persecution.
00:15:12 If you were a developer building this tool, would you go forth with it?
00:15:22 Let's talk about search engines. As I mentioned, certain search engines are blocked, and China has its own versions.
00:15:35 Google's Project Dragonfly aimed to provide censored search results in China. The project was shut down after widespread internal dissent.
00:15:46 It shows that when one person stands up, others recognize themselves in the situation and find the courage to dissent as well.
00:15:58 Now, one of the most notable examples is the social credit system. This system rates individuals based on their behavior.
00:16:11 If you commit a crime or refuse to pay a bill, your social credit points drop, affecting your access to public services.
00:16:25 In reality, companies like Alibaba and WeChat's parent Tencent control parts of the social credit system, creating social divides.
00:16:41 Has anyone seen Black Mirror, particularly the episode called 'Nosedive'? It’s so relevant to our current situation.
00:16:57 In that episode, individuals rate others based on interactions, mirroring what is happening today.
00:17:10 It's unsettling how this could lead to extreme social division. Additionally, there are reports of Muslim individuals being forced to surrender their passports and taken away to internment facilities.
00:17:32 The government reportedly wants to prevent religion from taking precedence over the state.
00:17:48 Various articles have documented the technical side of how they use social networks and databases to identify individuals.
00:18:02 In December 2018, people who had fled these camps gave testimonies describing forced labor.
00:18:13 We, as developers, create these tools; some of us even enjoy the technical challenge of building identification systems.
00:18:24 The drive for profit has led to Google's compliance with censorship in China.
00:18:36 Yet, our society permits this to happen, turning a blind eye rather than condemning these actions.
00:18:50 It reflects on us as developers and society at large. We are the first generation with digital footprints.
00:19:04 Parents post ultrasounds and biometric data of their kids on social media.
00:19:18 This behavior is socially acceptable and opens the door to stalking or even extortion.
00:19:27 There have been documented cases where hackers accessed digital records to extort parents.
00:19:40 We build large data collections, and while I'm responsible for my own data, what about the data of those who aren't even born yet?
00:19:55 We’re talking about the datafication and manipulation of children.
00:20:10 A recent study examined how Instagram influencers affect children’s behavior.
00:20:23 Kids exposed to influencers promoting unhealthy food opted for unhealthy food, while those exposed to healthy-food influencers showed no difference from the control group.
00:20:35 This indicates that promoting healthy behavior is harder than promoting unhealthy consumption.
00:20:48 As developers, we build the tools that exploit this behavior.
00:21:01 Let’s discuss 5G. It’s marketed for speed, but the costs are not just financial.
00:21:14 For developing countries, the infrastructure and potential digital divide are significant issues.
00:21:29 The internet was promised to unite us, yet it's creating divisions.
00:21:43 Moreover, access to information has its downsides. We fought for free content, but was that the right move?
00:21:56 Prior to the internet, 90% of revenue for artists came from CDs. Now, artists must tour to earn a living.
00:22:08 Some artists burn out from touring; the music industry is favoring mass appeal, leading to a decline in artistic diversity.
00:22:22 These challenges reflect across all creative sectors; we lean towards supporting bigger artists.
00:22:35 I want to touch on neural engineering and CRISPR technology.
00:22:49 We are nearing the possibility of creating superior human beings, but what will be the implications for our species?
00:23:01 Will we develop superhumans used for war, or will society segregate further? Has anyone read Asimov's Robot series?
00:23:16 In it, humans face consequences for creating superior races.
00:23:27 Should we consider halting scientific progress because of ethical dilemmas? Throughout history, science has caused harm.
00:23:41 I firmly believe in science, but we must recognize the potential ramifications.
00:23:58 As discussions on AI advance, we must evaluate how machines will be made to choose between human lives.
00:24:05 In self-driving cars, choices will have to be made on who to save in accidents.
00:24:22 I encourage you to visit moralmachine.mit.edu to explore these dilemmas.
00:24:33 We aren’t stopping progress, but it’s essential to acknowledge the dark side behind every invention.
00:24:45 The duties of developers extend beyond technical skills; we are the architects of this future.
00:24:58 Many companies exist primarily to profit, leading us to moral dilemmas.
00:25:11 Most of us believe we would resist unethical practices, but that’s not always the case.
00:25:25 Milgram's experiments showed how ordinary people will comply with authority, even when it means harming others.
00:25:34 Would we stand up against immoral features recommended by our bosses for profit’s sake?
00:25:47 This becomes a reflection of our society and how we publicly shift blame to executives and companies.
00:26:01 As developers and society, we need to consider our accountability in these actions.
00:26:10 We trade the common good for personal comfort, which leads to global challenges like climate change.
00:26:25 We prefer short-term benefits over long-term sustainability.
00:26:39 So, as a community, what can we collectively do about this?
00:26:50 I believe we can do much better. It’s crucial to take immediate actions, no matter how small.
00:27:06 I’ve engaged in non-profit work through CoderDojo, realizing that the audience attracted was primarily upper-middle-class kids.
00:27:21 I sought to create meaningful impacts rather than just raising awareness.
00:27:29 Hence, I chose to build the project from scratch, gathering talented individuals.
00:27:39 We aimed to improve our community without bureaucracy distracting us.
00:27:51 Consequently, we founded a platform called Include Braga to provide support where needed.
00:28:04 We matched anonymous letters from children in need with users willing to provide gifts.
00:28:14 During our first year, we helped about 170 children and elderly people, and doubled our impact the next year.
00:28:29 I share this to motivate others in the developer community to find their own path and their potential to make a difference.
00:28:44 We are the first generation that can create change without substantial resources.
00:28:56 We possess all the knowledge at our fingertips; we have the ability to reach people worldwide.
00:29:10 In the past, accomplishing something similar was unimaginable.
00:29:23 Even without substantial resources, we now have the tools available to us.
00:29:35 The cost of our impact was minimal; a project like Secret Santa only cost each participant a small sum.
00:29:48 I want to emphasize that although I don't claim to have all the answers, we can all engage in helping our communities.
00:30:05 We started this conversation by blurring the lines of morality and assessing the impact of technology.
00:30:18 This is only the beginning of a deeper conversation that our community must have.
00:30:31 You all have the skills to impact the world positively. Change doesn’t happen by chance.
00:30:40 Change occurs through commitment to a cause that resonates with you. You need nothing but your intellect.
00:30:53 With that, I urge you to embrace this cause, make an impact, and strive for positive change.
00:31:05 Thank you.