The impact of technology on culture, politics, and society has never been greater. The principles that determine how technology impacts our lives are critically important to understand, given the amount of time we spend with our gadgets and apps.
In the end, technology isn’t an industry, but rather a means of changing the culture and economics of existing systems and institutions. That’s difficult to grasp if we limit our view of technology to the things we buy. But technology extends far beyond the devices we carry in our pockets, and if we hope to influence the people who design and build it, we must be aware of some major social shifts that have taken place in recent years.
Even those of us who have been working in the tech industry for a long time may overlook the underlying forces that shape its impact. Using these fundamental principles, we can better comprehend the role of technology in contemporary culture.
Tech is not neutral
Every button, link, and glowing icon we see reflects the values of its creators. This is an important fact that everyone should be aware of when using apps and services. The design, technical architecture, and business model decisions made by software developers can significantly affect the privacy, security, and even civil rights of the people who use their software. Software that encourages us to take square photos instead of rectangular ones, to put an always-on microphone in our living rooms, or to be reachable by our bosses at any time alters our behaviours and our lives.
When we use new technologies, our lives change in accordance with the priorities and preferences of the people who created them.
Technology is not a given
Consumer technology is portrayed in popular culture as a never-ending upward progression that improves life for everyone. In reality, new technologies often trade gains in areas like usability or design against losses in privacy and security. Sometimes a new technology improves life for some people while making it worse for others. And just because a technology is “better” does not mean it will be widely adopted, or that it will spur improvements in other, more widely used technologies.
Much like evolution in the biological world, technological advancement includes many dead ends, regressions, and uneven trade-offs, even if we see overall progress.
People in technology, on the whole, have the best of intentions
We don’t have to believe that most people who create technology are “bad” to be sceptical and critical of modern tech products and companies. As someone who has met many people in the tech industry, I can attest that they truly believe they can make the world a better place through their work. Those who create new technologies generally do so with a sincere desire to make a difference. However, even well-intentioned creators remain responsible for the negative consequences of their work.
If we acknowledge that most tech workers mean well, we can help them follow through on those intentions, reduce the influence of those who don’t share them, and keep the stereotype of the thoughtless tech bro from overshadowing the impact that the thoughtful, conscientious majority can make. And if we want to effectively hold everyone responsible for the technology they create, it helps to start from the assumption that most tech efforts are made in good faith.
There is a lack of documentation and understanding of technology’s history
Plenty of people learn how to create technology, and they can usually find out how their favourite programming language or device came to be. But it’s nearly impossible to find out why certain technologies thrived, or what happened to those that didn’t. Despite the fact that many pioneers of the computing revolution are still alive and working today, technology history as recent as a few years ago has often already been erased. If other apps failed, what made yours succeed? What previous attempts to build such apps came before it? What problems did those apps have, or cause? Who was left out of the myths surrounding today’s most powerful tech companies?
All of these questions get glossed over, or answered incorrectly, for the sake of a narrative of smooth and seamless technological progress. The problem isn’t unique to technology; nearly every industry faces similar challenges. But this ahistorical view may leave today’s tech creators unable to learn from those who came before them.
In the vast majority of technical schools, there is no emphasis on ethics
Law and medicine are examples of mature fields where centuries of accumulated knowledge are integrated into the curriculum, with explicit requirements for ethical education. That training is no guarantee: we can see unethical people in positions of power who attended top business schools that boast about their lauded ethics programmes. But a basic familiarity with ethical concerns allows these fields to have informed discussions about ethics, and it provides a solid foundation for those who genuinely strive to do the right thing and perform their duties ethically.
Until the recent backlash against some of the worst excesses of the tech world, there had been little progress in making ethical education an expected part of technical training. Only a handful of continuing education programmes are devoted to enhancing the ethical knowledge of those already in the workforce; the majority focus on new technical skills. And simply bringing computer scientists and liberal arts majors together is no silver bullet. If technologists want to maintain their current level of public support, they will need to brush up on their ethical concerns quickly.
It’s not uncommon for technology to be built with a surprising lack of regard for its users
As society has grown more and more appreciative of the tech industry over the past few decades, it has often come to treat the people who create technology as infallible. Tech creators are now routinely regarded as authorities in a wide range of fields, including media, labour, transportation, infrastructure, and political policy, even when they have no prior experience in those fields. Making an iPhone app does not, however, mean that you understand an industry in which you have never worked.
As a result, the best and most thoughtful tech creators get to know the people they’re trying to help, to ensure they’re not just “disrupting” established systems but actually improving them. It is not uncommon, however, for new technologies to run roughshod over existing communities, because the people developing them have the financial and social resources to do so despite the shortcomings of their approaches. Tech developers routinely overlook the harm done by flaws in their designs, especially when they aren’t in direct contact with the people affected by those flaws. Compounding the problem, the lack of diversity in the tech industry means that many of society’s most marginalised groups are underrepresented on the teams building the next generation of technology.
It is impossible for a technological breakthrough to be the work of a single genius
Technology innovation is popularly depicted as a “Eureka!” moment: a “genius” coming up with a new idea in a dorm room or garage. Crediting people like Steve Jobs with “inventing” the iPhone perpetuates the myth of the lone creator, when in fact the device was the work of many. Technology is always shaped by the values and insights of the community its creators belong to, and almost every breakthrough moment is preceded by years or decades of others trying to create similar products.
The lone geniuses depicted in media are rarely from backgrounds as diverse as those of people in real communities, and the “lone creator” myth exacerbates the exclusion problems that plague the tech industry as a whole. Media outlets and educational institutions may benefit from singling out individuals for awards and recognition, but real creation stories are complex and involve a wide range of people, and we should be deeply sceptical of any story that claims otherwise.
Most technology is not created by startups
Startups employ only about 15% of all programmers, and at many large tech firms, the majority of employees aren’t programmers at all. The focus on the habits and cultures of programmers at well-known startups therefore distorts the public’s perception of technology. Contrary to popular belief, most technology creators work at organisations or institutions we don’t usually think of as tech companies.
There are also many smaller tech companies that specialise in creating custom software, websites, and mobile apps, and many talented programmers prefer to work for them because of the distinctive culture and challenges they offer. Startups make up only a small percentage of the overall tech industry, and we shouldn’t let their extreme culture define how we think about the industry as a whole.
Almost all of the world’s largest tech companies rely on one of three business models
If you want to understand why technology works the way it does, you need to know how tech companies make money.
Advertising: Nearly all of Google’s and Facebook’s revenue comes from advertising targeted using your personal data. These platforms have a strong incentive to steer you, through search results and social feeds, toward sites or apps that show more of their ads. Strikingly, this surveillance-based business model is the one most large consumer internet companies employ.
Big Business: Some of the larger (and generally more boring) tech companies, like Microsoft, Oracle, and Salesforce, make their money selling to other large companies that need business software and will pay a premium if it is easy to manage and easy to lock down. Most of this technology isn’t fun to use, because the people buying it are so focused on controlling and monitoring their employees, but the companies that make it are among the most successful in the tech industry.
Individuals: Other companies would rather have you pay them directly, for a device or a subscription. When you buy an iPhone, a Kindle, or a Spotify subscription, you know exactly what you’re getting. (Amazon’s retail business serves individuals this way, while Amazon Web Services squarely serves Big Business.) Because this model doesn’t rely on advertising, and doesn’t hand control of your purchasing decisions to your employer, these tend to be the companies where customers have the most sway.
That’s essentially it. Almost every tech company is pursuing one of these three models, and you can understand many of their decisions by looking at which business model they rely on.
The economic model of large corporations skews the entire tech industry
Today’s largest tech firms follow a simple formula:
Create a product that appeals to a large audience and disrupts an existing market.
Raise funding from venture capitalists.
Build the largest possible user base as quickly as possible, even if it means losing a lot of money for a while.
Figure out how to turn that huge audience into a business valuable enough to give investors an enormous return.
Aggressively battle (or acquire) other competitors in the market.
Traditional growth companies, on the other hand, typically begin as small businesses and grow primarily by attracting customers who directly pay for their products or services. This new model allows companies to grow much larger and much faster than those that had to rely on revenue growth from paying customers to expand. In addition, these new companies are less accountable to the markets they’re entering because they’re prioritising the short-term interests of their investors over the long-term interests of their customers or members of the community.
Companies without venture capital funding may find it nearly impossible to compete with such a widespread business strategy. A typical business that relies on paying customers simply can’t lose that much money for that long. It’s not a level playing field, which often means companies are stuck being either little indie efforts or giant monstrous behemoths, with very little in between. The resulting landscape resembles the film industry, split between small independent arthouse productions and blockbuster superhero movies.
The biggest cost for these large new technology firms is attracting and retaining the programmers who build their platforms; most of their money is spent on hiring and keeping them. The vast sums of wealth generated flow mostly to founders and investors, and are rarely used to benefit the broader community or to build equity for anyone else. Building an extremely valuable business no longer requires creating a large number of jobs for a wide variety of people.
Fashion and function go hand in hand in the world of technology
To the uninitiated, developing apps and devices looks like a logical process in which engineers select the most advanced and appropriate technologies for the job at hand. In reality, the choice of programming languages and toolkits is often driven by a programmer’s or manager’s whims, or simply by what’s in style. Fads and trends influence everything from how meetings are held to how products are created.
The people who create technology are influenced by social factors as well as an objective assessment of technical merit when they choose between new and familiar technologies. While many companies like to boast about how ambitious or cutting-edge their new technologies are, that doesn’t mean that they provide more value for regular users, especially when new technologies invariably come with new bugs and unexpected side effects.
No institution has the authority to stop the misuse of technology
In most industries, if a company does something wrong or exploits consumers, journalists will investigate and criticise its actions. If the abuses continue and become serious enough, lawmakers at the local, state, national, or international level can take action against the companies.
A large portion of today’s tech media focuses on product launches and updates rather than social impacts, and the reporters who do cover those impacts are often published alongside reviews of new phones rather than featured prominently in business or culture stories. This has begun to change as tech companies have become so powerful and wealthy, but the culture within media companies still limits coverage. Business reporters carry a lot of clout in major media outlets yet are often clueless about the fundamentals of technology, while the tech reporters who best understand how technology affects culture are usually assigned to product announcements rather than broader civic or social issues.
The situation is even more dire among regulators and elected officials, who often boast about their lack of technological literacy. Politicians who can’t set up an app on their own smartphones are poorly equipped to regulate technology effectively or to assign legal responsibility when tech creators break the law. Legislation lags far behind the state of the art in addressing the new challenges technology presents to society.
Tech companies often operate as if they are completely unregulated, and the consequences of that reality usually fall on those outside of tech without the corrective force of journalistic and legislative accountability. It gets worse: Activists using tried-and-true methods like boycotts or protests are often rendered ineffective by the indirect business model of giant tech companies, which can rely on advertising or surveillance (“gathering user data”) or venture capital investment to keep operating even if activists are successful at identifying problems.
One of the biggest problems in technology today is the absence of systems for holding people accountable.
We can improve technology by gaining a better understanding of these concepts
If all of this is so complicated, and so many important points about technology aren’t obvious, is the situation hopeless? No.
Once we understand how technology is shaped, we can begin to influence its development. Knowing that the tech giants’ biggest cost is attracting and hiring programmers, for example, tells us that those programmers can collectively advocate for ethical and social progress from their employers. Knowing that these companies depend on investors, who are notoriously sensitive to market risk, means unethical practises can be framed as a financial liability.
Many in the tech industry have good intentions but lack the historical or cultural context to ensure their impact matches those intentions; we can work to ensure they receive the education they need to prevent harm before it occurs.
It’s hard for many of us, whether we’re creators of technology or just fans of the ways it enhances our lives, to accept the many negative effects some of these same technologies are having on society. If we begin by establishing a common set of principles that help us understand how technology works, we can begin to address some of its most pressing issues.