A Case Study of Palantir Technologies Through the Lens of Justice Sustainability Design
Palantir is an American technology company that specializes in providing platforms for private companies and governments to mine and aggregate their own data and enable better decision making. The company was founded in 2003 by Peter Thiel and Alex Karp, who met at Stanford Law School. Palantir was able to capitalize on a flood of defence spending after the September 11th attacks on the World Trade Center, receiving early investments from In-Q-Tel, the C.I.A.’s venture capital arm (MacMillan, 2019). The company’s largest customers are the governments of Western countries, and it offers two different platforms: Foundry, its commercial application sold to the private sector, and Gotham, its government-level platform used in national security and defence. The word Palantir comes from an object in J.R.R. Tolkien’s The Lord of the Rings; in the films, the palantir is the seeing stone used to look into the past and future (Steinberger, 2020). Palantir is a company with no shortage of controversies, the highest-profile incidents being its involvement in the Cambridge Analytica scandal and its work with I.C.E. in the United States (Navellier, 2021). This paper will use Palantir as a case study of a company that repeatedly runs counter to Justice Sustainability Design practices, looking at examples of how this company operates and makes decisions through frameworks and readings relating to Justice Sustainability Design.
Palantir’s technology is used in a multitude of ways across several different sectors. The less insidious examples of how Palantir is used would be logistics with the US Army, supply chain optimization with Airbus, food and resource distribution with the United Nations, and money laundering investigations with Credit Suisse (Steinberger, 2020). Alex Karp, the CEO, admits that “every technology is dangerous, even ours” (Steinberger, 2020). The company and its co-founder Peter Thiel have become something of a pariah throughout Silicon Valley due to Thiel’s support of Donald Trump and his libertarian political views. The company’s opposition to the politics of Silicon Valley progressed far enough for it to relocate its headquarters to Denver, Colorado (Waldman, 2019). When Google became embroiled in controversy over Project Maven, which was to be a partnership between Google and the Pentagon to develop autonomous weapons systems, Google ended up backing out of the project. Thiel criticized Google as being treasonous and infiltrated by the Chinese government, and Palantir later took over the project (MacMillan, 2019). The ethos of Palantir is very much along the lines of “we just build the tools,” without spending much time considering adverse consequences or the unintended effects their tech has on the people on the receiving end of it. An internal privacy and civil liberties team is guided by the question “Do I want to live in the kind of world that the technology we’re building would enable?” (MacMillan, 2019). Karp believes that it is the job of governments, not Silicon Valley, to answer the difficult questions about how technology may be used to surveil citizens (MacMillan, 2019). Palantir does not do business with those it considers adversaries of the United States and its allies, stating the company was built to support the West (Steinberger, 2020).
A major flaw in the way that Palantir operates is in the way that it views the world. People who work at Palantir believe that many of the world’s problems are simply data integration problems (Steinberger, 2020). This can be understood further by drawing on Webber’s idea of wicked problems. Palantir takes a scientific approach to solving social policy problems, which is bound to fail. Social problems are what can be referred to as wicked problems; science deals with tame problems (Webber, 2019. Pg 67). Going back to the industrial age, ideas of planning were dominated by the idea of efficiency, and this still pervades government and industry today (Webber, 2019. Pg 70). Scientific problems are very different from social problems: social problems can never really be solved like a math problem, only temporarily re-solved (Webber, 2019. Pg 72). Solutions to wicked problems can only be good or bad, not true or false (Webber, 2019. Pg 72). One example of Palantir’s involvement in trying to solve a wicked problem occurred at the US-Mexico border, when its technology was used by I.C.E. to arrest 443 people who were relatives of unaccompanied children that had made an illegal border crossing (Franco, 2020). A social problem arose: unaccompanied children, separated from their families, trying to cross the US-Mexico border. A scientific solution was presented: use data webs to locate relatives of these unaccompanied children, then arrest them. I.C.E. has a $91 million contract with Palantir and describes its technology as “mission critical” (Steinberger, 2020).
Computational thinking sees the world in terms of a series of problems that have computational solutions (Easterbrook, 2014. Pg 235). This way of viewing things has many flaws, as it overlooks that not all problems can be expressed using the abstractions of computational thinking, nor can all problems be reduced to discrete variables to be solved (Easterbrook, 2014. Pg 236). It is a rationalist view that only sees what can be measured and implies that if something can’t be measured then it doesn’t exist. Palantir is a company that relies heavily on “Big Data.” The Gotham and Foundry platforms gather vast quantities of data in order to draw connections humans might miss. Organizational data is collected, synthesized to reduce siloed data, and formatted so that it can all be viewed in one single platform (Steinberger, 2020). “Messy” data can be viewed in tables, graphs, timelines, and heat maps, creating a digital panopticon of an organization. While any organization can harness the power of big data through Palantir’s platforms, it is also opening itself up to big data’s dangers. Big Data implies that the solutions to all our problems can be found through pattern matching and algorithms, rather than through improving our understanding of the forces that shape behaviour (Easterbrook, 2014. Pg 237).
Palantir is an organization that has been subject to the single action bias on more than one occasion. The single action bias occurs when decision makers take one single action about risks they encounter, but are less likely to take a second and a third action to reduce their risk further (Weber, 2006. Pg 115). Decision makers realize or are made aware of a problem, do one small act to correct it, then consider the matter closed without taking more steps to mitigate risk further or questioning the system that produced the problem as a whole. Palantir has been called out by the public, policy makers, and its own employees over multiple projects. When the company’s relationship with I.C.E. was made public, 200 employees sent a letter to Alex Karp expressing their concerns; some small concessions were made, but on the whole not much changed (Steinberger, 2020). In this case there is a single action bias on behalf of both the employees and the organization. After the Cambridge Analytica scandal, one “rogue” employee was fired and blamed for the whole ordeal. On other occasions when Palantir has been made aware of problems within the company, its single action has been to double down on its stances and change nothing. Karp has stated that they will stand by their government clients when it is convenient and when it is not (Franco, 2020).
Inter-temporal choice is a choice between options whose consequences occur at different points in time. Inter-temporal choice studies show that future outcomes are often undervalued relative to immediate outcomes (Soman, 2005. Pg 348). When individuals are temporally distant from an outcome they interpret it at a high level; when they are temporally close they tend to see the details and focus on low-level attributes (Soman, 2005. Pg 356). Inter-temporal choice can explain several unintended consequences that have arisen through the use of Palantir technology. In a contract with JPMorgan Chase, Palantir tech was successfully used to track down a leaker within the bank. For this to be successful, all of JPMorgan’s data had to be fed through Palantir’s platform (Waldman, 2019). After cheers for the technology in finding the leaker, the senior executives who had called for using the tech learned that they too were caught in the web of surveillance they had employed. This alarmed them and led them to shut down the program, even debating whether they should file a security breach report with regulators (Waldman, 2019). The executives at JPMorgan were all for the use of this technology until they later became caught in its crosshairs. This example can serve as a metaphor for the company as a whole if we think of Palantir’s involvement with local police forces. At the inception of the company, Palantir engineers were eager to create technology for the military and the war on terror for a short-term gain, without seeing the long-term consequence of this same technology being weaponized and turned on ordinary citizens in the West (Waldman, 2019). The time and distance from the effects of their initial actions were too great for the consequences to be understood.
Critical heuristics aims to provide planners and affected citizens with tools for discussing the implications of problem definitions, systems design, and program evaluations (Ulrich, 1987. Pg 276). Whole systems judgments deal with assumptions about what belongs to the real world to be studied and improved and what is outside of this scope (Ulrich, 1987. Pg 278). Justification break-offs deal with what is relevant when justifying implications for affected populations (Ulrich, 1987. Pg 278). Critical heuristics is something that Palantir should employ in its design and decision making process, especially on projects that deal with local law enforcement. Operation Laser is a program Palantir is involved in with the L.A.P.D. that aims to identify and deter people likely to commit crime (Waldman, 2019). Data is aggregated from numerous police databases to create a list of chronic offenders, and this list is given to patrolling officers with orders to monitor and stop these pre-crime suspects as often as possible (Waldman, 2019). This is essentially a program to terrorize people with a history of crime and continue to treat them as criminals. Operation Laser also creates secondary surveillance networks that form a web of who is related to these pre-crime suspects. Police can see this information as they drive around and automatically scan license plates. Systems like this are simply built, with no consideration for who is on the other end of them and without the thought that maybe something could go wrong. No amount of expertise or theoretical knowledge is enough for the expert to justify all their judgements; the expert is no less a layman than are the affected citizens (Ulrich, 1987. Pg 281).
Critical systems thinking, interpreted through the writing of Flood and Jackson, has three main components: critical awareness, emancipation, and methodological pluralism (Midgley, 1996. Pg 11). There have been critiques of Flood and Jackson’s writings, but they still hold relevance when talking about systems today. The idea of critical awareness is especially valuable when trying to understand a company like Palantir. Critical awareness aims to understand the context of application and possible consequences of a system, and to closely examine the assumptions and values entering into actually existing systems designs or any proposal for a new system design (Midgley, 1996. Pg 15). Understanding that Palantir started off funded by the C.I.A. and providing technology to fight the war on terror and the war in Afghanistan, even having a hand in locating Osama Bin Laden (Steinberger, 2020), adds a lot of context to the way this company and its systems operate. Tools of war, when repurposed for civilian use, are sure to have adverse consequences. Assumptions built into these systems for fighting war cannot be taken out. A condition for emancipation to be achieved is freedom from oppressive power relations (Midgley, 1996. Pg 14). Palantir’s systems have a long way to go toward emancipation, as programs like Operation Laser show. The company admits that it doesn’t police the use of its products, but maintains that it has access levels and logs built in (Steinberger, 2020). Scope creep occurs between what was the original goal of these systems and what their real-world application is today. Speaking to the Washington Post, a former Palantir employee stated that “there is a version of the story where they are the good guys, everyone wants to protect service members from IEDs. Everyone wants to prevent human trafficking. Not everyone can get behind working for I.C.E. to help deport immigrants” (MacMillan, 2019).
Palantir is a for-profit business, and as of 2020 it is a publicly traded company listed on the New York Stock Exchange. Some investors worry that the company has been involved in so many scandals that one more could easily derail it, and treat its stock with great caution (Navellier, 2021). Some investors have elected to stay away from the stock completely out of ethical concerns. The financial value of software is easier to measure than its fairness (Becker, 2019. Pg 40). The more weight we give to certain values like wealth, political influence, or power, the bigger the ethical blind spots of the existing values become (Becker, 2019. Pg 40). Through the examples previously mentioned, one can see that Palantir certainly has quite large ethical blind spots. Artifacts have politics; they embody and enact political and social values (Becker, 2019. Pg 45). Looking at Palantir’s company and platforms as artifacts, political and social values are deeply embedded into their systems. These values tend to skew towards libertarianism, with a hostility towards actors they do not see as members of the Western world. Underdetermination argues that technological choices are always more than technical; technical choices are relative to a context, responding to a social world (Becker, 2019. Pg 45). Technology adapts to its environment, introducing biases from society, and through technical codes, cultural values are embodied in technical artifacts (Becker, 2019. Pg 40). In the case of Palantir, the organization’s cultural values are deeply embedded in its technology and in its decision making process as an organization. A few more examples of this: in 2011 the company was involved in a misinformation campaign to discredit Wikileaks (Steinberger, 2020), and in 2010 the company proposed that the US Chamber of Commerce run a secret sabotage campaign using the company’s technology to plant false information and discredit their liberal opponents (Waldman, 2019).
Looking at Palantir through the lens of Justice Sustainability Design and analyzing its technology and organization through this lens has brought to light many examples of unethical behaviour: its involvement in the Cambridge Analytica scandal, serving as mission-critical technology for I.C.E., work with local law enforcement, the contract with JPMorgan Chase, and more. There is no shortage of examples of the negative effects that can occur when technology developed to fight the war on terror is repurposed and re-targeted on ordinary civilians. Palantir is a concrete example of what happens when every problem is treated as a computational problem and when a company is led by a group of people who believe they are just making tools and writing code.