Ten commandments for developers
In the beginning was the Word. And that Word was code.
In developed countries, high technology runs power grids and logistics and provides access to information and services. Surgeons perform brain operations with robotic tools on patients thousands of kilometers away. We can talk to family and friends at any moment, even from the other side of the planet.
Lines of code also control combat drones whose operators don't really choose who ends up under the strike – militants or non-combatants. They build the attention-retention mechanisms of social networks, driving addictive behavioral patterns, FOMO (fear of missing out), dopamine burnout, and depression. They run the darknet's drug and arms markets.
The software developer's profession has grown up – after the rights and privileges, it is time for responsibilities and restrictions to appear, just as they did for doctors and for the military.
To the barrier!
A developer is coming home from work, running through the shopping list from his wife in his head – "cabbage, bread, butter, sausage, hair conditioner, sponges, Venus razors, the new issue of Cosmo" – and it doesn't even occur to him that at this very moment, thanks to his code, a GHR (Giant Humanoid Robot) is striding alone across the hills of a distant snowy proving ground. Meanwhile, the developer remembers that the list also includes "Johnson's baby oil and yogurt with live cultures."
An IT architect sits in a comfortable leather chair, looking at the man in uniform across from him and suppressing the instinctive urge to bark "Sir, yes, Sir!" He knows only vaguely, but at least roughly, where his team's code will eventually end up.
The owners of huge corporations, sipping Glenfiddich 1937 with a ten-star general, are already calculating the profits from the joint project.
Collective responsibility sounds good on paper, but it applies poorly in reality. An ordinary engineer rarely thinks about how his work will ultimately be used. Moreover, the "reality given to us in sensations" hints that the concepts of absolute good and absolute evil are quite relative – they depend not only on the era and cultural context, but sometimes simply on geographic coordinates and political demand.
It is worth thinking about ethics; the main thing is not to slip into the famous haiku:
“Men desire male flesh.
I stand alone in the musketeer's cloak with the sword.
Far from vicious pleasures.”
“GOODGLE” and Project Maven
In March 2018, it became known that Google was helping the Pentagon develop artificial intelligence that analyzed video from combat drones in real time. Google's top management praised the project's value – its budget was expected to reach $250 million.
Project Maven is specialized software for analyzing high-altitude aerial imagery, capable of recognizing moving objects, tracking their paths, and transmitting information in real time to the servers of the US Department of Defense.
Google employees signed a petition demanding that the "Corporation of Good" stop participating in Project Maven and avoid joint projects with the Pentagon in the future. In the petition, Google developers wrote: "We believe that Google should not be in the business of war. Therefore we ask that Project Maven be cancelled, and that Google draft, publicize and enforce a clear policy stating that neither Google nor its contractors will ever build warfare technology."
The Google developers' petition was supported in an open letter by 90 well-known scientists working in the field of artificial intelligence. In addition, the Tech Workers Coalition, a nonprofit organization that brings together tech workers in California and Washington, appealed to other large IT companies to refuse to cooperate with the US Department of Defense.
Google employees who opposed Project Maven explained their position as follows: humans, not artificial intelligence, should analyze the data coming from combat drones, because the cost of an error is too high. Artificial intelligence can certainly distinguish a moving object from a stationary one, and a living creature from an inanimate thing, but it cannot determine whether a person is a militant or a non-combatant.
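To make that limitation concrete, here is a minimal sketch (in Python; the Detection class, the label set, and all values are invented for illustration and have nothing to do with Maven's actual code). An object detector can only answer the questions its label set encodes, and "militant vs. civilian" is simply not one of them:

```python
# Illustrative sketch only: a generic object detector has a fixed label set.
# Everything here is a hypothetical stand-in, not Maven code.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "person", "car", "truck"
    confidence: float  # detector confidence, 0.0 .. 1.0
    box: tuple         # (x, y, width, height) in pixels

# Labels a typical detector is trained to recognize (invented for illustration).
KNOWN_LABELS = {"person", "car", "truck", "bicycle", "animal"}

def describe(detection: Detection) -> str:
    if detection.label not in KNOWN_LABELS:
        return "unrecognized object"
    if detection.label == "person":
        # "Is this a person?" -- the model can answer with some confidence.
        # "Is this person a militant or a civilian?" -- not representable here.
        return "person (combatant status: unknown -- requires human judgment)"
    return detection.label

example = Detection(label="person", confidence=0.93, box=(120, 48, 32, 80))
print(describe(example))
```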
It is clear that the Pentagon does not give a star-spangled damn about how many civilians end up under its humanitarian bombing. But as the reaction inside Google showed, developers do care how their code is used. Gizmodo, citing sources at Google, reported that as a result about ten employees left the company for ethical reasons.
One of those who quit said: "At some point I realized I could not recommend that anyone join Google, knowing what I knew. And if I can no longer recommend this company to other people, then why am I still here?"
Google did not discuss the departures with the press. But in June 2018 the company announced that it would not renew the contract with the Pentagon.
The US Department of Defense got what it wanted anyway. Instead of Google, the development work was taken up by Anduril Industries, founded by Palmer Luckey, the inventor of the Oculus Rift virtual reality headset and a co-founder of Oculus VR, who is clearly not much troubled by the ethics of building military technology.
The “Corporation of Good” and the Chinese search engine
Somewhat less well known is the story of the search engine developed for China. China's information policy is no secret: the Golden Shield project blocks access to a number of foreign sites, Chinese media cannot link to news from foreign outlets without special permission, and search results are filtered by keywords related to state security. In effect, the "Great Firewall of China" is the first case of deep censorship on the net, a kind of greeting from the "brave new world" of the future.
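For a sense of how crude such filtering can be, here is a minimal sketch of keyword-based result filtering (the blocklist, the function name, and the sample data are all invented for illustration, not taken from any real system):

```python
# Minimal sketch of keyword-based censorship of search results.
# The blocklist and sample data are invented for illustration only.
BLOCKED_KEYWORDS = {"forbidden-topic-a", "forbidden-topic-b"}

def censor_results(results: list[dict]) -> list[dict]:
    """Drop any result whose title or snippet contains a blocked keyword."""
    kept = []
    for result in results:
        text = (result["title"] + " " + result["snippet"]).lower()
        if not any(keyword in text for keyword in BLOCKED_KEYWORDS):
            kept.append(result)
    return kept

sample = [
    {"title": "Weather today", "snippet": "sunny with light wind"},
    {"title": "Report on forbidden-topic-a", "snippet": "..."},
]
print(censor_results(sample))  # only the weather result survives
```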
In August 2018, more than 1,500 Google developers signed a collective letter protesting the development of a censored search engine for China. The developers said that the existence of a secret project inside Google to build a search engine that would censor content at Beijing's behest raises complex moral and ethical questions.
Google had long wanted to return to the Chinese market, and building such a search engine was one of the conditions in the negotiations. For the sake of the huge Chinese market, Google was prepared to step back from the principles of free speech and submit to the demands of the Chinese authorities.
A separate team was set up inside the company, and since the spring of 2017 it had been working on a project code-named Dragonfly. After Google CEO Sundar Pichai met with senior Chinese government officials in December 2017, the project received additional resources and the work was accelerated as much as possible.
After the developers' protest, Google executives had to put the project on hold. At a company meeting, Sundar Pichai tried to reassure developers: "We're not close to launching a search product in China. Whether we would do it, or could do it, is completely unclear."
The point of no return
A unique situation has developed in the global IT market:
- An acute shortage of highly skilled developers.
- Rapid growth of the IT industry, with universities simply unable to train specialists fast enough to meet demand.
- The dependence of almost every sector of the economy on IT specialists.
- The dependence of large international companies' success and profitability on their in-house development teams.
As a result, developers can dictate terms to those who are used to giving orders themselves.
In human society, codes of ethics usually appear in professions that deal with people's lives and health (the military oath, the physicians' oath). It is a pity there is no politicians' oath – their decisions, personal fears, and backroom games kill a great many people. Sometimes it seems that politics is not a profession but a mental disorder.
The developer's profession has now come close to the point where people's lives and health, both physical and mental, depend on the specialist. Unfortunately, there are only a few cases where developers have raised difficult ethical questions. For the most part, people prefer not to think about the consequences of their work.
I think that sooner or later a broad discussion will arise among IT specialists, so that ethical questions are not raised only in response to a particular corporation's dubious projects, and are not handed down "from above" to reassure employees. And surely that movement "from below" will produce simple, clear rules that may not make the world better, but will not let it become worse.
Afterword
Let's try to imagine how the principles of a 21st-century developer might sound:
- A developer may not harm a human being or, through inaction, allow a human being to come to harm.
- A developer must carry out the work tasks given by the team lead, except where such orders would conflict with the First Law.
- A developer must protect his own safety, as long as doing so does not conflict with the First or Second Law.
- A developer should not take someone else's code without asking.
- A developer may not write code that could potentially harm human life or health.
- A developer may not write code that restricts basic human rights: freedom of speech, thought, and belief.
- A developer may not write code that will be used to limit people's access to reliable information.
- A developer may not write code that exploits a person's neurophysiological and psychological vulnerabilities to gain control over their behavior and time.
- A developer must be prepared for the fact that things rarely go either smoothly or ethically.
- A developer must be prepared for the fact that everything that can go wrong will go wrong.
Questions:
- Is it important for everyone to have a code of ethics? Why should we raise our level of morality?
- Would you join a team that is developing military technology?
- Try to add, change, or delete some of the commandments from the afterword.
Source:
Comments
On the other hand, I can conclude that everything is okay with your level of morality and ethics... if so, that's really great news!
Now I even have hope and faith in humanity...)
2. I don't think so. Developers should be the ones solving problems, not making them worse. Also, I couldn't sleep if I knew I had helped to kill people.
3. You are not a tool! You are a human being, so follow your conscience.