DRAFT

This living essay is still in draft form. It is a continuation from my Design and Neoliberalism Introduction – I would suggest reading that one first. Please mind the typos, half-finished thoughts, and informal language. If you have comments, please email them to me. The purpose of this essay is for eventual submission to the Design & Neoliberalism Special Issue of Design and Culture. I don’t know if they’ll be interested, but I’ve enjoyed working on it either way 🙂

 

“One might in fact speculate that the more intractable and resistant the real world faced by the planner, the greater the need for utopian plans to fill, as it were, the void that would otherwise invite despair.” – James C. Scott

Setting the stage for Neoliberalism


Historical movements often take shape and gain momentum by reacting against the movements that preceded them, so a useful first step in understanding the philosophical elements that seeded both early neoliberal ideas and 20th century design in the late 1930s is to consider what came before.

In his analysis of state development in the late nineteenth and early twentieth centuries, political scientist and anthropologist James C. Scott[1] describes a particular fascination with ordered, scientific wholes. Scientific naturalism was rising in prominence, and many movements around the world aspired to bring the clean, structured understanding of the natural sciences into the realm of the social. He speculates that many decisions of statecraft were driven by a particularly pernicious combination of three elements:

  1. The aspiration to the administrative ordering of nature and society raised to a far more comprehensive and ambitious level.
  2. The unrestrained use of the power of the modern state as an instrument for achieving these designs.
  3. A weakened or prostrate civil society that lacks the capacity to resist these plans.

He uses the term “High modernism” to describe the drive toward this comprehensive, administrative order.

“What is high modernism, then? It is best conceived as a strong (one might even say muscle-bound) version of the beliefs in scientific and technical progress that were associated with industrialization in Western Europe and in North America from roughly 1830 until World War I. At its center was a supreme self-confidence about continued linear progress, the development of scientific and technical knowledge, the expansion of production, the rational design of social order, the growing satisfaction of human needs, and, not least, an increasing control over nature (including human nature) commensurate with scientific understanding of natural laws.” (Scott, 1998: Ch. 4)

Scott goes on to describe the justification for this era as partially seeded by an adaptation of Enlightenment-style scientific reasoning, flipped from a descriptive to a prescriptive mode. In this new era, scientifically designed schemes for production and social life would be superior to the accidental, irrational decision making of historical practice. This understanding became a justification for the enlightened elite – those who possessed scientific knowledge and the power to bring order to the world – to do exactly that.

These theories were nothing new in the early 20th century, but they came to their zenith in many ways through the establishment of more ambitious forms of statecraft – most notably through the celebration of Keynesian economics and New Deal politics in the US, the rise of Bolshevism in what would become the U.S.S.R., and fascism in the European states. While these political forms varied dramatically in their details, they shared a common faith in the imposition of political and economic order through centralized bodies, and a belief in the universality of common standards that could be understood through the scientific method and imposed in a top-down manner.

Design in this time period


“Design,” as the practice and industry that we consider today, began to take form during this time period of the early 20th century, and we can look to a number of movements in art and design that philosophically mirrored “high modernist” ideas.

It was primarily through discussion of “form” that this shift toward a rational, scientific understanding of the world played out in design. Prior to this era, the prevailing aesthetics in the commercial realms of both product design and architecture – Beaux-Arts and Art Nouveau, for example – favored elaborate decorative ornamentation.

Around the turn of the century, these styles began to be challenged by ‘modernist’ movements. Adolf Loos, an admirer of Louis Sullivan, famously criticized ornamentation as “criminal” for its propensity to simply invent trends for the sake of marketing. As the world moved on from WWI and looked to rebuild, a new spirit was needed for the aesthetics of the era it wanted to create.

Designers and architects of this era found this spirit in “Function.” As early as 1896, American architect Louis Sullivan had written his now famous claim that form in nature follows not from decorative appeal, but from function.

“Whether it be the sweeping eagle in his flight, or the open apple-blossom, the toiling work-horse, the blithe swan, the branching oak, the winding stream at its base, the drifting clouds, over all the coursing sun, form ever follows function, and this is the law.”[2] – Louis Sullivan, 1896

As the 20th century progressed, the alignment of an object’s form to its “function” could serve as the barometer for good design. Function was the pure essence of an artifact’s being – anything that did not contribute to an artifact’s function should be questioned.

Central to this understanding was the metaphor of the machine of the Industrial Age. The simplicity and purity of an airplane’s wing or a motor’s cylinder were products of scientists’ mastery over Newtonian physics, to the extent that engineers could produce these artifacts with a high level of precision and consistency. There was some complexity to consider – heat, air pressure, and other factors needed to be taken into account – but the variables required to craft functioning machines could be worked out more or less by a small group of engineers working in their labs.

An added benefit of the Newtonian machine as central metaphor was its universality. Mechanical physics was mechanical physics – these laws worked anywhere in the world.[3] An artifact designed based on the laws of function could have an aesthetic value independent of culture – and could truly be worthy of the moniker “International Style.”

“Function” could mean either function in the absolute sense (the function of an airplane is to fly) or function in relation to a human (the function of an airplane is to transport humans through the air) – coming to an understanding of the orientation of this logic is fundamental to understanding the shift that would occur over the course of the 20th century.

An older understanding of “Form follows Function,” represented by Le Corbusier’s sketch, is an argument against ornamentation.

Notably absent in design discussions of this era was the effect that a designed artifact would have on the humans using it. “Function” was defined by the mechanical task at hand: the function of the airplane was to fly, the function of the automobile was to move. To the extent that humans were considered at all, they were mechanical measurements of mass and function just like everything else.

To the extent that humans were anything more, they could be considered problematic – humans tended to be disorganized, and their individual wills could not be trusted. The goal was to push individual humans toward greater scientific rationalism, and a good way to do this was through the design and distribution of rationally designed artifacts. Perhaps the quintessential messenger of this new idea was the Swiss architect and visionary Le Corbusier, who left no question about where he understood truth in design to emerge from.

“The despot is not a man. It is the Plan. The correct, realistic, exact plan, the one that will provide your solution once the problem has been posited clearly, in its entirety, in its indispensable harmony. This plan has been drawn up well away from the frenzy in the mayor’s office or the town hall, from the cries of the electorate or the laments of society’s victims. It has been drawn up by serene and lucid minds. It has taken account of nothing but human truths. It has ignored all current regulations, all existing usages, and channels. It has not considered whether or not it could be carried out with the constitution now in force. It is a biological creation destined for human beings and capable of realization by modern techniques.” – Le Corbusier[4]

As neoliberalism entered the philosophical stage, many of these ideas would be challenged. Industrial designers like Henry Dreyfuss and corporations like the Herman Miller Co. turned toward studying individual humans in greater depth – first in an ergonomic sense and, eventually, in a cognitive sense as well. As this focus shifted, the measure of value for a designed artifact would shift from a simple appreciation of an object’s aesthetic as a representation of mechanical function toward a user- or customer-centric orientation. This shift, however, was not simply a trend in design, but design’s interpretation of a broader shift in the way the human was seen to relate to society.

Neoliberalism challenges centralized structure.


“It was God’s prerogative to make a world suitable to His governance. Men govern a world already in being, and their controls may best be described as interventions and interferences, as interpositions and interruptions, in a process that as a whole transcends their power and their understanding… The actual situation…is the result of a moving equilibrium among a virtually infinite number of mutually dependent variables.” – Walter Lippmann, The Good Society, 1938

Scholars of Neoliberalism often cite the so-called “Walter Lippmann Colloquium,” held in Paris in 1938, as the foundation of the term in its modern sense. Lippmann himself was the author of The Good Society, a book published in 1937 which celebrated liberalism over other political ideologies popular at the time.[5] Notable attendees at the colloquium were Ludwig von Mises and Friedrich Hayek – two influential thinkers who would go on to lay the groundwork for a movement that would grow to a peak in the 1980s and 1990s.

As a whole, the movement built on the ideas of classical liberalism that had emerged primarily as Anglo-Saxon ideas of the 17th and 18th centuries – Hobbes, Locke, and Smith[6] – and which were heavily influenced by Continental European (in particular, French) thinking and by the revolutions in both the United States (1776) and France (1789–1799). These movements laid the groundwork for the philosophical concepts that underlie the modern understanding of the relationship between humans and nature, humans and each other, and humans and their governing institutions.

The zealousness of neoliberal ideas between the middle and late 20th century may have formed in large part as a contrasting ideal to the Anglo-Saxon world’s growing rivals. In Neoliberalism’s early years, Toryism, Fascism, and Socialism were on the rise in Europe – trends that Lippmann specifically sought to challenge in his book and conference. In the post-war era, Neoliberalism found a new rival in Communism.

Many of the core tenets of Neoliberalism, including an emphasis on the individual, a push toward universal principles (rather than designs), a reduction in the scope of state governance and a celebration of emergent order through marketplace dynamics, can be viewed as a direct challenge to the core values of 20th century centralist ideas.

As the movement matured and gained momentum, an increasing number of institutions grew to support Neoliberal values (in principle if not in name), among them academic institutions such as the Chicago School of Economics and political think tanks such as the Cato Institute, (more… more…).

The relationship between Neoliberal ideals and economic success is hotly contested. Proponents of Neoliberal ideals often present the causal claim that “liberal principles in economics—the ‘free market’—have spread, and have succeeded in producing unprecedented levels of material prosperity, both in industrially developed countries and in countries that had been, at the close of World War II, part of the impoverished Third World” (Fukuyama, 1998, introduction). That the United States grew to such economic dominance in the late 20th century at the same time as a shift toward Neoliberal policy appears to support this claim. Other opinions more closely resemble former Singaporean prime minister and geopolitical savant Lee Kuan Yew’s note that the success of the US has been the result of “geopolitical good fortune, an abundance of resources and immigrant energy, a generous flow of capital and technology from Europe, and two wide oceans that kept conflicts of the world away from American shores.” (Allison, 2013)

We will leave the direction of causality to the geopolitical scientists – for our purposes, it will suffice to note that a substantial part of design discourse in the late 20th and early 21st century has grown within “The Garden” of Neoliberalism’s rise – an idea we will explore below.

Before turning to the environment, however, it is useful to understand at a deeper level the philosophical rationale and values that underlie Neoliberal ideas. While acting proponents of Neoliberalism focused primarily on the translation of these ideals into political and economic systems, we will also see how these values have shaped design philosophy and practice in this era.

While there are many flavors of neoliberalism, most share a few philosophical ideas that evolved from older liberal ideas and took on economic and public policy implications in the contemporary era. In the context of design, we should consider two effects of Neoliberalism, which I refer to as “The Seeds” and “The Garden.”

The Seeds

The ideals and philosophical constructs that Neoliberalism shares with design thinking and practice.

The Garden

The environment, constructed by Neoliberalism as a dominant economic and political idea of the late 20th century, in which design would grow.

The Seeds


 

The core of neoliberal philosophy re-orients the focus of human systems around the agency of the individual. Neoliberals argued for this orientation most notably in two areas: individualism as a moral political ideology, and individualism as a way toward economic efficiency.

The neoliberal political orientation was in many ways a continuation of a thread from earlier liberal movements that had laid the groundwork for the concepts of “citizen” and “private property” in western Europe several centuries before.[7]

For academics like Hayek, systems of distributed agency such as markets provided more than a simple mechanism for the allocation of capital and resources – they were the primary means by which knowledge of a complex world could be generated and most efficiently used by humans. In his 1945 essay The Use of Knowledge in Society, Hayek argued that because the world was filled with incalculable amounts of dispersed and often contradictory information, it was only at the local level that reality could be seen and the best decisions made. A centralized authority could never know what was best, because it could never understand more (let alone enact policy better) than a particular person on the ground in a particular situation.

This humility set the stage for neoliberals to argue that the best possible outcomes for the individual, as well as for society as a whole, require placing as much power as possible with the individual. Any overarching structures that limit the ability of the individual make the system both coercive and inefficient.

There are three overarching ideas of neoliberalism that fundamentally shaped the way design is practiced today:

  1. A new understanding of where information comes from: neoliberals believe that information is distributed in the world, and only agents in context are suited to efficiently consume it.
  2. A notion of order: because information is distributed in the world, order comes not from a centralized design, but emerges as the conglomerate of agents acting in local situations with limited but contextual knowledge, coordinating with other agents in close proximity.
  3. A position on the control systems that regulate the roles of these agents in relation to centralized authority: because order tends to “emerge,” it cannot be designed by a central planner. Control systems should, therefore, simply consist of minimal regulatory principles that can be exercised in a consistent, predictable way across the system.

With a deeper understanding of these three principles, we can see how they have influenced design practices of the modern era.

 

Distributed, Contextual Knowledge


Today it is almost heresy to suggest that scientific knowledge is not the sum of all knowledge. But a little reflection will show that there is beyond question a body of very important but unorganized knowledge which cannot possibly be called scientific in the sense of knowledge of general rules: the knowledge of the particular circumstances of time and place. – F. A. Hayek, 1945

Neoliberal theorists were highly critical of the confidence their opponents drew from high-modernist conceptions of scientific knowledge. It wasn’t that neoliberals abandoned scientific universalism – in some sense, they took the concept to new levels. Their critique was of the human attempt to apply “scientific” principles to social engineering with the expectation that such a narrow understanding could result in a positive outcome.

The problem was one of complexity. By the middle of the 20th century, scientific “laws” found in natural sciences such as chemistry or physics could be relied on consistently enough to engineer machinery with increasing degrees of complexity and precision. The centralized collection of knowledge in the natural sciences meant that the same steam engine created in the UK could run in Tokyo or Rio de Janeiro in exactly the same way. The same mechanics applied, and the same engineer could operate it in any environment.

According to Hayek, most disputes in economic theory and economic policy had their common origin in an erroneous attempt to transfer the habits of thought developed to deal with phenomena of nature onto social phenomena.[8] Implicit in this assumption was the notion that – in the same way that an engineer with a command of physics principles can design a machine – a social engineer would have access to the information needed to design effective social or economic policy.

Hayek argued that no planner could be so smart, because the type of information that would be needed for such designs could never be centralized – so much of it was specific to the particular circumstances of time and place. He noted that:

It is with respect to this that practically every individual has some advantage over all others because he possesses unique information of which beneficial use might be made, but of which use can be made only if the decisions depending on it are left to him or are made with his active cooperation.

Lippmann, too, criticized this idea in his opponents, whom he accused of desiring to live under a “providential” state.[9] It was simply impossible to amass all of the information that would be needed for a centralized plan, and even if it were possible, no human would be able to comprehend it.

The concept of distributed knowledge presents a challenge to those who feel the need for order, or the desire for a society to achieve certain ends. Neoliberals, however, provided an answer in a new understanding of order – an order that emerged from the complexity of the system itself.

 

Emergent Order: A belief in the Invisible Hand


While neoliberals argued that an order cannot be imposed on a system, they believed order is possible – it simply emerges as the sum total of individual agents acting within local contexts with local knowledge.

This concept takes different forms and metaphors; in conversations about economics, it appears in particular in the idea of the “marketplace.” Without a centralized planner to direct the allocation of material resources, a shifting understanding of value (measured in “price”) enables material goods to flow through a system. In a marketplace, an object is assigned a value based on its merits in the eyes of the individuals present. Neither the item’s cost nor its history is considered; the value of the object is fundamentally dependent on the assessment made in that moment.

Wedged firmly in the ideology of liberals of all stripes is the “invisible hand” of the marketplace, which pulls capital and labor toward their most effective ends. Individual actors within the system do not need to see this hand, or know the direction it pulls them in – acting with their own ambitions within their own spheres brings them into a coordinated choreography with the economic system as a whole.[10]

Neoliberals extend the marketplace concept to discourse itself with the concept of the “marketplace of ideas.” In this marketplace, any interlocutor can show up and make her case regardless of her background or qualifications. It is the merit of her arguments, proponents say, that should carry the day – not her background.

The concept of emergent order, therefore, displaces the idea of “universality.” The system as a whole learns to appreciate types of “value” as universal mechanisms, but the value of a unit in this system – be it an artifact, a currency, an hour of someone’s labor, or even one’s own freedom – flexes over time based on the context that unit is in. There are no universal laws about the value of an object, only universal principles for how that value may be determined in context.

By shifting the idea of value to a contextual understanding – one that no centralized system could ever really comprehend or execute on – the roles of both the agent and centralized bodies were forced to change. The very understanding of “knowledge” itself needed to change, from the simple recording of information to a higher level of decision making based on principles.

 

Governance through Principles (Not Plans)


While the early 20th century held that it was the role of centralized control systems such as governments to use scientific knowledge and rationality in the pursuit of dominance over nature and human nature, by the middle of the century the conversation had turned more cautious. While scientific universalism was here to stay, neoliberal skepticism about whether this rationality could be acted upon required a different approach. Rather than imposing universalism on the system by design, the neoliberal West shifted toward simply maintaining order through universal principles.

Specifically, if the energy and intelligence of the system came from the nodes, it should be the role of the center to empower (not direct or coerce) these nodes to promote a system that would be healthier and more dynamic as a whole.

In these systems, the way nodes are treated becomes paramount. Distributing information to every node is difficult, so minimal, simple, universal rule-sets are preferred to complex regulatory systems. It is also easy for central systems to abuse individual nodes, so setting restrictions on the central power’s authority becomes a primary concern.

In the political space, this took the form of emphasizing the rights granted to individuals and the limits on central governments. In economics, it meant stripping away barriers that would slow the natural dynamics between individuals in the system.

In business environments, this meant shifting toward systems that gave lower-level employees more autonomy, and basing assessments on the value produced through results rather than on simply following orders and completing tasks. As we will see in design, this means shifting the focus and definition of the “quality” of a design away from universal measures of aesthetics toward whether it is useful, usable, or desirable to a potential user.

In every case, the emphasis shifts from the alignment of agents in the system to their empowerment. This new universalism changed the nature of the bodies that impose will: rather than roll out a plan, they need simply provide the structures, principles, and frameworks that keep natural growth healthy.

In Summary


High-modern theory functioned like the classical industrial machine: because physicists had mastered basic Newtonian physics, mechanical engineers could feel relatively confident in the design of their machines. The society-as-machine paradigm, however, is something Neoliberals cautioned against. To them, society still functioned according to scientific principles, but unlike simple physics, the principles that underlie social functions were too complex for social engineering to be possible. Studying society was more akin to studying complexity, systems theory, or cybernetics, wherein a system cannot be designed because there are too many intractable facts at play.

When Hayek and Lippmann first started writing about this theory, it was new and relatively unpopular. As the decades progressed, however, and the Mont Pelerin Society spun off more think tanks, entered more universities, and began to produce students, these basic assumptions about the nature of the world began to take hold; by the late 20th century, the neoliberal conception of the world was all but ubiquitous in the English-speaking world, in both private enterprise and the political sectors.

It was this influence – and this orientation toward the individual (citizen, customer, user…) – that shaped the “garden” in which late 20th century design practice would grow.

The Garden


In addition to supplying the philosophical rationale for design to take on a more human-centered approach, the neoliberal era changed the economic landscape as well.

The most important driver of any method of production is the motivation behind it. Human societies have seen many ways of energizing human labor, from the honor and glory cultures of thousands of years ago, through the pull of monotheistic religion in the past few hundred years, and on to capital, which has become a primary driver most recently.

While the flow of capital can take many forms,[11] in the US during the late 20th century the predominant arrangement and flow of capital was driven by neoliberal theories of the open market. The logic of this game was simple: the goal was a higher return on one’s investment, and to achieve it, an organization had to create what would be valued by agents in the marketplace. Corporations in the consumer sector operated on the notion that “the customer is always right” and rushed to produce the goods that consumers would buy.

This new focus on the consumer changed the internal composition of corporations. The field of marketing grew in importance and began to shift away from simple communication about product features toward a more nuanced understanding of – and, in many ways, the shaping of – consumer preference.

The conversation about “branding” became more robust as marketers sought to use expressions, emotions, and archetypes[12] to position[13] products against competitors.

These more nuanced battles on store shelves led to a flurry of psychological research as corporations tried to out-pace one another in any way possible.

Core to all marketing strategy, however, was the neoliberal notion that, at the end of the day, what was “good” would be determined by the consumer. Success would ultimately be determined by what moved best off of store shelves, but marketing professionals developed ever more elaborate strategies for understanding these processes beforehand, through trend-watching organizations, surveys, focus groups, and a host of other methods that would source information from the individuals who would ultimately make the decisions.

The importance of “design” as a practice, of course, varied by industry. For general consumables, graphic design and packaging were all that was needed, and tremendous focus was paid to keeping design fresh. For more complex products, however, the product often needed to sell itself. As this became more important for hard goods – and in particular as monetization models started to shift during the push into digital technologies – the industry pulled designers further up the value chain into increasingly strategic roles.

The effect of neoliberal values on late 20th century design practice.


To make the case that late 20th and early 21st century design rested on the same values as neoliberal economic and political policy, we must see whether there is a commonality in practice and theory between these two as contemporary movements.

For a shift in design practice to coincide with the rise and decline of neoliberalism, we would expect to see initial ideas seeded in the industry in the late 1930s and early 1940s, developing toward an apex between the late 1980s and early 2000s. We would then expect to see the initial seeds of a new shift around the time of this writing, the early decades of the 21st century (which I explore in greater detail in Part II).

If neoliberalism and modern design theory do, in fact, share values, we can expect to see certain shifts in the way “design” is discussed and practiced over the course of this time period. We would expect to find the energetic desire for a universal aesthetic of the early 20th century replaced with an emphasis on individuals in contextual situations. Specifically, we would see the industry responding to the re-orientation of agency and value around the individual with some specific responses to:

Distributed, Contextual Information
The industry shifting toward new methods of collecting and consuming information in these contexts.

Emergent Order: A belief in the Invisible Hand
This industry of “planners” aligning around the new nuclei of power and value: individual consumers.

Governance through Principles, not plans
An industry evolving toward establishing general rules of practice rather than standard solutions to problems.

In essence, we would expect to see a shift from design being a high-minded, normative practice toward a practice that studies, celebrates, and seeks to articulate the micro interactions of individuals in context and puts the ideals of individual choice and agency at the heart of its practice. This design would seek to serve the whims of the individual rather than shape them.

Before analyzing the effects on the study, process, measures of “quality,” and who is looked to for measuring the quality of design, it is helpful to summarize some key trends in the design industry of the 21st century.

The evolution of design in the late 20th century


The modernist era was driven by the normative application of universal aesthetics, consistent with a simplistic understanding of scientific (largely mathematical) rationality. Just as an engineer could master the laws of nature to create ever more complex machines, the designer could master aesthetic principles and, in her studio, apply them to commercial objects.

A shift, however, was inevitable. In the United States, a focus on free-market capital led to the rise of the corporation as the fundamental patron of design. The need to sell meant that the measure of the “quality” of a design took a new form: did the consumers of a designed object actually like it? Was the quality of a designed artifact determined by a person’s ability to use it?

“Human Factors” became a more important topic, and designers such as Ray and Charles Eames, employed by the furniture manufacturer Herman Miller, started studying the way humans actually used objects in context. Henry Dreyfuss famously turned his attention to ergonomics, writing books such as Designing for People (1955) and The Measure of Man (1960).

By the 1980s this conception of design’s purpose was gaining traction. Don Norman famously published The Design of Everyday Things,[14] arguing exhaustively that the quality of a designed artifact came from a user’s ability to understand it.

By the 1990s the idea had been taken even further. In addition to the physical attributes of an individual (which could be measured in a lab) and the behavioral psychology of an individual (which could be understood as universal truths about cognition), designers shifted toward an interest in individual understanding that might vary across different social contexts. Design now sought to understand individuals and their mental models at an even deeper level, to give them an even greater sense of empowerment. “Design Research” came into its own as a topic of study and an official profession.

In the new era, therefore, a new orientation around the “customer” or the “user” grew practices for understanding the individual in physical, cognitive, and contextual ways, with ever increasing degrees of nuance and granularity. The goal of uncovering and acting on this new, distributed knowledge led to a design “practice” that saw much fanfare in the corporate world of the early 21st century, as terms like “Design Thinking” and “Human-Centered Design” entered the collective vernacular.

It is in this late manifestation of design practice that we can see the common themes of neoliberalism fully formed. In particular, we can see the outcomes of a neoliberal worldview across four main areas of design practice:

  • Quality: A shift in the understanding of the quality of a designed artifact from universal (such as mathematical or geometrical) attributes to contextual characteristics.
  • Qualifier: A shift in who judges the quality of a designed artifact, from the educated eye of the specialist to the lay eye of the consumer and, as a result… (Norman, usability, etc..)
  • Study: A shift away from the studio and into the context of use for determining whether an object is well designed. (Design Research / Human-Centered Design / psychology, etc…)
  • Process: A design process and method that celebrates a “marketplace of ideas” as the core of its intellectual engine. (Design Thinking)

As we look in detail at these four areas, we see that each is well represented by trends in late 20th and early 21st century design practice.

Quality: From Universal to Contextual Characteristics.


Consider the similarities between these two statements:

“Value is not a property of objects or a quality they possess. Although we talk of objects “having value,” we mean that we value them. Value is in the mind of the person contemplating the object, not in the object itself.” (Dr. Madsen Pirie)

and

“It is in the interaction with people that products obtain their meaning: On the basis of what is perceived sensorially… products reveal cues of how to use them, and they reveal their function. Perceived properties will only be of interest to an individual if they are somehow instrumental in fulfilling needs; only in relation to people can we determine what behaviours a product allows for, and what its primary or secondary functions might be.” (Hekkert, 2007, p.4)

In both of these statements – one representative of a Neoliberal mindset and one of a design mindset from the early 2000s – we see a contextual state of value. Neither an object’s value (Pirie) nor its purpose (Hekkert) can be measured in an objective, atemporal sense. The root value is projected by humans in-situ or, as London School of Economics scholar Saadi Lahlou[15] would put it, in “the Installation.”

It is important to remember that neither of these statements was uttered in a vacuum – both are a response to (and rejection of) prevailing notions that preceded them. Pirie goes on to reject the Marxist, supply-side theory of value, arguing instead that because value can only be measured within a particular physical and temporal context, a universal calculation would be impossible. Hekkert – and many of the contributors his work draws on – represents a shift away from the more universal aesthetic ideas of design’s predecessors. When designers such as Le Corbusier (and others…) argued for an “International Style,” it was a universalist current that carried the day.

Notably, in neither situation are the authors arguing that a solution couldn’t be universal. Rather, they are arguing for a shift away from universal principles of aesthetics toward universal methods of problem solving. Both the neoliberals and the designers shift their source of “truth” to the individual – neoliberals through an emphasis on perception and choice, and designers through an increased interest in cognitive psychology and, eventually, an understanding of local context (citation needed). They therefore construct the argument that IF a universal truth exists, it must be deduced from a study of the way things play out in the real world – not projected from a normative theory of how the world should work.

For a neoliberal, a good system is one in which the individual has more freedom to pursue her goals regardless of what those goals are. To a late 20th century designer, a good product is one that a user can use to pursue her goals – whatever they may be. In this way, both philosophies position themselves as seeking to promote individual agency by crafting systems and practices that define value in context.

Because they exist in systems that localize agency, the measure of value itself shifts: in both cases, the contextual understanding of value now lies in the individual “customer.”

Qualifier: From Designers to Consumers


The neoliberal shift toward distributed agents as focal points for information processing and agency reconfigured the way humans working in neoliberal societies think about human networks. In this neoliberal environment, these “agents” became “users” or “consumers” of products, and commercial organizations shifted to the “customer is always right” mentality that would win in the marketplace.

With this new orientation of value – value not as something that could be generated within the organization but, instead, something determined through customer interaction – new methods needed to be developed for organizations to collect information about value, comprehend it internally, and contribute to its production. This meant a shift toward a richer vocabulary and new models for understanding human behavior.

Understanding both the desires and the behaviors of agents in the world became paramount, and different types of focuses emerged based on different contextual situations.

In the consumer marketplace, the most important measure of value from the perspective of producers tended to be at the point of purchase.

The push toward emotion in branding led to greater variability in the messages that people would receive. The product qualities of a particular line of soap might not be differentiated at the scientific level, but the shift from an objective, scientific standard (“does this soap effectively clean?”) to a customer-centric perspective (“is this the right soap for me?”) created the impetus for a plethora of differentiated graphic and packaging design. To the extent that brands sought to teach customers about their products, the goal was always to increase customer savvy in order to set an expectation that the brand would be best able to fulfill. In the end, though, the decision lay with the consumer – the product design must always acquiesce to this understanding.

In the B2B space, by contrast, it was not the moment of purchase that most mattered, but the productivity of the user. This led to increased research into how individual humans used space, to better understand the way new objects could be crafted to fit existing habits.

Herman Miller’s revolutionary designs for the modern workplace exemplified this approach. Designers studied the way office workers organized information in context, then designed a modular set of furniture to meet these needs.

The goal was not to change the way people worked, but to shape the artifacts in the environment to complement existing practices. The assumption was that the “user,” in the context of her daily environment, knew more about the task at hand than the designer. Simply shaping the material aspects of the environment around the user’s behaviors and mental model would, therefore, improve the user experience.

Herman Miller furniture also did not seek a universal truth for the design of a workplace. Rather, their modular system pushed decision making to the agents in the context (in this case, the companies and employees) to implement furniture as they saw fit.

As the ultimate goals shifted, a new vernacular was needed: rather than focusing on the objective truth of the product, designers sought to understand the “mental model” of the prospective agent who would decide whether or not to engage with it. The term “affordance” describes the perception of what something in the world does, specifically through the eyes of the user. Unlike the previous aesthetic vernacular, which concentrated more on universal mathematical principles of visual cues, “affordance” is directly related to an object’s function. However, this is an important shift from simple discussions of “function,” which can be more or less tested objectively. Affordance is, therefore, a measure of whether the function of an object is implied to the perceiver within the logic of her mental model.[16] “Consistency” as a design principle survives not because it is good in a philosophical sense, but because individuals are more likely to remember consistent patterns and more quickly understand how something may function. “Easy to use,” in itself, became a product benefit with which to win over customers.

In the case of all of these terms, designers could no longer rely on their intuitions and training to determine a “good” design from a poor one. Ultimately, everything needed to be tested with users or customers before large investments in production could be made.

Again, the ultimate measure of value was not the aesthetic or theoretical beauty of the objects produced – it was whether they increased comfort and utility in the eyes of the people who used them. This, again, is a direct reflection of the neoliberal understanding: the most useful solutions empower agents to work within the situations that they understand best. Ultimately, the decision of whether or not to adopt a solution lies with the agent, but modular solutions give agents more power and flexibility, and are thus preferred in the market.

The best method for proposing a solution, as the Herman Miller designers demonstrated, was to spend time in the local context. This need led to dramatic shifts in where design research would occur and how the design process should be structured. It is this notion that “design” as an industry would solidify around during the latter part of the 20th century.

Study: From the Studio to the Context of Use


Both the industries of marketing and design turned toward understanding the mental models of consumers as the core source of truth and, by consequence, developed methods for getting consumer insight earlier in the production process to mitigate risk.

As an industry, however, design arguably took this idea much further. Supported by an increased production of psychology studies, many in the design industry came to separate the “reflective” from the “behavioral” aspects of human cognition.[17] The “reflective”[18] aspects of psychology referred to the rational cognition and narrative understanding that users had about how the world fit together. This tended to be sufficient for understanding the broad goals that customers had, and the types of narratives that would influence buying decisions.

“Behavioral” cognition, however, could only be recalled by a subject in context. This cluster of activities relied on the older, habit-forming “reptilian” areas of the brain,[19] which managed sensory perception at a much richer level than rational cognition alone. Writers such as Andy Clark discussed more “distributed” forms of cognition, arguing, essentially, that the “knowledge” a human possessed did not simply reside in the brain. Rather, as humans use technology – such as a pencil and a piece of paper – to remember something, keeping these artifacts within their periphery, they are, in fact, acting as “natural-born cyborgs.”

With this new idea of cognition – cognition distributed into familiar environments rather than simply residing within a person’s head – it becomes impossible to fully study a person in a lab, or to draw solid universal rules about the structure of mental models. The information is in context – and a designer can only generate a possible solution by understanding the way this information flows in-situ.

This understanding opened the floodgates for a new type of research – one that came to reside under the broad umbrella of “design research.” Many of these methods, such as “contextual inquiry,” “diary studies,” and “shop-alongs,” specifically targeted ways to watch user behavior and get feedback “in context.”

This also brought a new emphasis to the “prototyping” phase of the traditional design process. Prototypes wouldn’t simply be for designers, engineers, and business people to align behind specifications or functionality; they should be tested with users and – when possible – in context.

The most radical transition was the introduction of “participatory design” or “co-creation” practices. These brought not only the measure of “quality” to the user, but the construction itself[20] to the user. The position of “designer” shifted to one of moderator and finalizer of decisions ultimately made (or at least fundamentally directed) by users or customers.

Process: From Experts to Design Thinking


…this one is more complicated, coming soon…

Conclusion


The neoliberalization of the designer mindset, and of the context in which design is practiced, fundamentally changed the practice itself. As mindsets shifted from an emphasis on the expression of universal ideals imposed on a system toward the empowerment of agents within the system, designers positioned themselves as the conduit through which agent decisions could be understood and channeled into production.

So long as the agency of the individual is a top priority and the individual is seen as the master of information in context, we can expect the practice of design to center around the user.

As we turn to Part II, however, we will see that this fundamental focus is itself a relic of its era. As we look toward a shifting philosophical undercurrent in the West, and toward commercial (and government-influenced) enterprises in the East working their way toward more strategic positions in the value chain,[21] we can expect the practice of design to shift again. In many ways, this transition has already begun.

Part II: (coming soon)

Footnotes

1. Seeing Like a State, 1998
2. …Where function does not change, form does not change. The granite rocks, the ever-brooding hills, remain for ages; the lightning lives, comes into shape, and dies, in a twinkling.
It is the pervading law of all things organic and inorganic, of all things physical and metaphysical, of all things human and all things superhuman, of all true manifestations of the head, of the heart, of the soul, that the life is recognizable in its expression, that form ever follows function. This is the law.
3. There were, of course, more experimental ideas in physics, but in terms of pragmatic mechanics, the tolerances of a production machine could overcome most of this variability. Few people needed to pilot their Model T to the top of Mount Everest.
4. Quote taken from James Scott’s “Seeing Like a State.”
5. Tucker, 2017
6. check, cite
7. Insert some Scott in here, and hopefully some Charles Taylor
8. The Use of Knowledge in Society…Also, I need to double check this in context to see if it still fits with my argument here
9. Lippmann in particular was responding to fascistic and communistic governments and sympathizers, which were growing in popularity at the time. About communism in particular, he argued that one would have to have “providential” amounts of knowledge to orchestrate such an endeavor.
10. The Anglo fascination with the marketplace has been studied in great detail by academics in recent years. Geert Hofstede, in his book “Software of the Mind,” which looks at differences in values between members of different cultures, notes the difference between American and British ideas of the marketplace and those of even nearby cultures such as the Germans or the French.
11. Later in this essay, we will touch on the way modern China directs capital.
12. For a wonderful breakdown of archetypal branding, see Margaret Mark and Carol S. Pearson’s “The Hero and the Outlaw.”
13. For an understanding of how marketing positions products against competition, see Ries and Trout’s book “Positioning”
14. Originally titled “The Psychology of Everyday Things.”
15. Installation Theory.
16. Similar is the discussion of “display-control compatibility.” Whether an object’s “display” and “control” are “compatible” is ultimately a decision that must be made in the mind of the user.
17. Don Norman, “Emotional Design.”
18. Or, as psychologist Daniel Kahneman would later describe it, “System 2.”
19. for more on this, see Charles Duhigg’s The Power of Habit
20. at least, in prototype form
21. in other words, positions in the value chain where it becomes important to develop internal design competencies