28 March 2017
An in-depth look at what A+D has learned from medical history and the relationship between people and the spaces they occupy.
It sounds facetious, but this question isn’t so outlandish after all. You see, our creative practitioners across A+D are beginning to look further afield, to equally advanced and tech-embracing practitioners in unrelated sectors. Among these, the medical field in particular is proving eye-opening. And it all started more than 50 years ago.
In the late 1960s Alvan Feinstein – professor at the Yale University School of Medicine – published Clinical Judgment. The medical world was never the same again.
Up to that point, the practice of medicine was heavily based on the amorphous concept of ‘clinical reasoning’ – a highly biased and surprisingly unempirical process that led physicians to make decisions without adequate checks or controls. Feinstein’s publication brought intense scrutiny to bear on medical decision making, raising awareness of weaknesses in what was considered ‘standard best practice’ across the full spectrum of the healthcare industry.
In brief, Feinstein’s approach sought to anchor policies and procedures not to the current practices or beliefs of experts, but to investment in medical experimentation and the application of its results in practice. This ‘evidence-based medicine’ ushered in a new gold standard with the capacity to adapt along with the times. That capacity to evolve benefitted healthcare twice over: it kept practice in step with advances in technology, and it put the patient’s needs (not the practitioner’s values or procedural expertise) first and foremost.
Alongside this paradigm shift in the medical workforce, access to healthcare across the globe skyrocketed. Patient treatment numbers increased, first-world health practices penetrated developing countries more deeply than ever before, and as a result we became, in general, healthier and longer-lived. As health attracted more and more patient investment and evidence-based teachings entered medical education, a new workforce of highly educated younger practitioners began to enter the fray – all extolling the virtues of rigorous R+D, continuous experimentation and strict statistical analysis in practice.
Inevitably, as we became better at treating ill health, our healthcare institutions fell under increasing strain. Last month I reported on the soon-to-be-dire situation these same institutions will face in the not-too-distant future. With rising life expectancy and access to health disproportionately outstripping the healthcare facilities that service them – in design, in capacity, in rapidity – the global A+D community now finds itself called upon to answer an enormous array of stakeholder needs.
At its very core, evidence-based medicine sought to achieve something deceptively simple: understanding the patient’s recovery possibilities more, and the practitioner’s abilities less. And, while A+D is increasingly involved in helping redefine the future of health and its facilities, we’ve begun to demonstrate a remarkable feat of pedagogy: we’ve borrowed medical-thinking practices and incorporated them into design-thinking results.
Seeking to understand the needs of the people we design for in this acutely sensitive environment, A+D frontrunners in the field recognise that, until recently, designers, planners, architects and specifiers did not seek appropriate levels of explicit, empirical data to inform their design processes and decision making. Not unlike medical practitioners pre-1960s, we rarely sought data that existed beyond our own experience and expertise (as practitioners) or that of our clients (the institution). The end-user – in this case, the patient – lost out. It was hardly surprising, then, that the loudest criticism levelled at us over the previous three decades was that A+D had become too self-involved and had ceased to solve actual social ills – all encapsulated in the catch-cry that ‘design is too important to be left to designers’.
In our post-digital age, demand continues to grow for evidence-based approaches to designing our material environments. Whether residential, commercial, institutional or cultural, our creative practitioners increasingly place intelligence-gathering at the core of a new breed of design thinking. By employing design processes that support existing user behaviours, limitations and needs, we’ve rethought the end-user: no longer someone forced to adapt themselves to the object of design, but someone for whom the object of design is adapted to the parameters and limits of their body. Across the entire playing field of A+D, such approaches have engendered more cost-effective and better-liked end products (not to mention loyal consumers).
And this is a relatively – in fact, surprisingly – recent phenomenon. In 2008 a landmark systematic review for the design and health industries concluded that designers are as importantly positioned as doctors when it comes to creating material healthcare environments that actively contribute to patient recovery.
Simply, it’s design that embraces everyone … always.