Informing Explorations On Built Form.

PERSPECTIVES
Bookend Polymaths (Perspective, 2024)

As the challenges of the modern world grow more complex, the perceptual powers of polymaths are no longer luxuries—they are necessities.
In an era dominated by specialization, polymaths - those who master and integrate knowledge across multiple fields - often find their contributions overshadowed or misunderstood. Society’s preference for measurable expertise and immediate results favors specialists, leaving polymaths in the shadows despite their critical role in driving innovation. The suppression of polymathic value stems not from their lack of contribution but from the difficulty in recognizing and quantifying their impact.

While specialists thrive within well-defined metrics of success, polymaths operate in the abstract, connecting disparate ideas and navigating complexity. Their strength lies in synthesis: seeing links between disciplines and applying insights from one area to another. For instance, a polymath who blends neuroscience and education might pioneer new learning technologies, creating outcomes that transcend traditional boundaries. Yet, such achievements are often credited to specialists refining specific elements rather than the polymath whose vision made them possible.

This undervaluation extends to jacks-of-all-trades - generalists whose versatility enables them to adapt to diverse challenges and bridge gaps between specialists. Their big-picture thinking facilitates collaboration, yet their work is often dismissed as superficial. The phrase “jack-of-all-trades, master of none” perpetuates a stigma against breadth, obscuring its role in addressing today’s interconnected challenges.

The neglect of polymaths and generalists arises from societal biases and short-term thinking. Specialists deliver measurable results, but polymaths provide foundational insights, systems, and strategies that foster breakthroughs over time. By focusing on the immediate and tangible, we have overlooked the long-term value of their integrative and visionary contributions.

To recover the potential of polymaths, we must redefine success. Recognizing the power of interdisciplinary thinking and valuing roles that bridge silos can elevate these unseen architects of innovation. As the challenges of the modern world grow more complex, the perceptual powers of polymaths are no longer luxuries—they are necessities. By celebrating breadth alongside depth, we can ensure that polymaths are no longer neglected but empowered to shape the future.
Fixed Wings (Perspective, 2024)

Aviation engineering borrows from nature’s archetypes—birds for stable, adaptable flight and fish for streamlined, rapid travel.
The placement of an aircraft’s wings - whether high, midline, or low on the fuselage - reflects critical design differences that parallel the way birds and fish move within their environments. High-wing planes, with wings mounted at the top of the fuselage, resemble the structure of birds, with their wings positioned above their bodies for stability. This configuration naturally suspends the plane’s weight below the wings, creating a pendulum effect that stabilizes the aircraft in turbulent air, making it ideal for rugged terrain, low-speed flight, and reliable handling in variable conditions. High-wing planes are designed to remain steady, much like birds soaring on steady wings above the earth.

Conversely, mid- and low-wing aircraft echo the streamlined bodies of fish, which cut through water with minimal drag, efficiently propelling forward. Without needing to fight gravity, fish stay suspended in water, so their bodies emphasize forward speed and sleek movement. Similarly, the mid- or low-wing design places the wings closer to the fuselage’s center, optimizing aerodynamics for fuel efficiency and speed. These planes, such as commercial jets and fighter aircraft, are balanced for forward momentum and efficiency at high speeds, with minimized drag for smooth, high-altitude travel.

All planes must continuously generate lift to combat gravity in a medium that offers no natural support. High-wing designs acknowledge the need to address instability, like birds, while mid- and low-wing designs can prioritize efficient flight once in the more stable stratosphere, like fish. Together, these placements reveal how aviation engineering borrows from nature’s archetypes—birds for stable, adaptable flight and fish for streamlined, rapid travel—balancing stability and speed in response to gravity’s constant pull. Evolutionarily speaking, it is worth noting that despite some advantages, there are no birds with wings attached to their bellies, presumably because survival outweighs performance.
Neural Fog (Perspective, 2022)

At this scale, it is impossible to even describe the rules of the game let alone the method to determine a winner.
A neural network is a three-dimensional structure in which each interconnected neuron offers a possible path of travel. Interestingly, trying to explain how to best navigate a neural network also provides insight into how these structures are useful tools.

If we pick a random start and end point within the network, many paths connect the two points together as an open-ended solution. However, if we include additional criteria, the constrained path sets become recognizably unique. For example, we can differentiate them with rules such as: the paths must include defined points, be shorter than a certain distance, avoid certain neurons, be restricted to certain diameters, or include a certain number of turns. Once criteria are assigned, these network paths can be interpreted and compared. This process of selection is analogous to decision-making in the everyday world, such as how we eat a meal. We can critique the choices of quantifiable neural networks with reasonable intelligibility.
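As an illustrative sketch (the toy network and rule thresholds are invented for this example, not drawn from the essay), the selection process described above can be expressed in a few lines of Python: enumerate the open-ended set of paths between two points, then constrain it with rules.

```python
# Toy network: each neuron maps to the neurons it connects to.
network = {
    "A": ["B", "C"],
    "B": ["C", "D"],
    "C": ["D"],
    "D": [],
}

def all_paths(net, start, end, path=None):
    """Enumerate every simple path between two points (the open-ended solution)."""
    path = (path or []) + [start]
    if start == end:
        yield path
        return
    for nxt in net.get(start, []):
        if nxt not in path:  # never revisit a neuron
            yield from all_paths(net, nxt, end, path)

# Constrain the open-ended set with rules, as the essay describes:
paths = list(all_paths(network, "A", "D"))
short = [p for p in paths if len(p) <= 3]   # shorter than a certain distance
via_b = [p for p in paths if "B" in p]      # must include a defined point
no_c  = [p for p in paths if "C" not in p]  # must avoid certain neurons
```

Each rule carves a recognizably unique subset out of the same open-ended path set, which is what makes the constrained results interpretable and comparable.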

The complexity of a neural network problem can also increase exponentially as we include multiple rules and as the two points diverge to include additional paths. This complexity can even extend beyond the currently available paths to include future considerations, where not all the paths can be definitively described yet. For example, most complex problems also have to take into account outside, uncontrollable forces, so some rules must estimate probabilities for paths not yet visible. With multiple competing probability rules, we also have to rank their importance. This approximates the conditions of playing a game of Go, where an opponent’s moves must be contemplated and where strategy is difficult to critique while underway. Such problems are less quantifiable and rely on qualitative interpretations of patterns. While this is more challenging, there are still clear limits to the variables at play since the game has rules and a clear method to determine a winner.

The deepest neural network problems are multifaceted global problems where we are trying to link together multiple qualitative neural networks into a fog of consequence. This requires a simultaneous understanding of multiple scales (forest, tree, leaf) while observing from multiple points of view (forward, beneath, within) and planning for systems of growth (embryo, child, adult). At this scale, it is impossible to even describe the rules of the game, let alone the method to determine a winner, because we are aiming for a network of improvement. Both the delineation of the solution AND the problem are neural networks, and the solution changes with every change in the definition of the problem. These networks are unnavigable without computational tools, if navigable at all, as we can only interpret issues sequentially ourselves. We will therefore never control the process without depending completely on computational tools to plan and reason. At this deepest level, the best we could hope for is to create a system with immeasurable power and good intentions.
Reciprocal Design (Perspective, 2021)

Stress testing along the way would only show a complete failure of the system.
A reciprocal structure is binary in strength but interdependent in assembly. It cannot support any weight until all members are in place but once successfully prepared, the structure is extremely strong. These structures are prepared the way you would fold the flaps of a cardboard box together (to avoid the need for taped joints) - incrementally folding the sides together or working on all sides simultaneously until they are woven together. If one side is accelerated, the remaining sides are strained or cannot fit together.

The preparation of a genetic design is also reciprocal. For example, the tailorability of the Formid product to human anatomy permits the flattening of data acquired from the hardware (all seats are comparable in movement as bodies are normalized) so that the machine learning algorithm can compare data sets across a wide variety of anatomical idiosyncrasies with ease. As an interdependent design, none of these systems can be prepared without the others, and their interdependency necessitates that the work progress simultaneously prior to a demonstration-ready product.

Unfortunately, the evaluation of such a system while it is being designed is not as impressive unless you understand the process. Stress testing along the way would only show a complete failure of the system. For example, capital investment in a project will require proof of sales; however, a project that is large or complex enough may require external funding prior to such proof, leaving the project subject to stress testing and seeming like a total failure.
Formless Potential (Perspective, 2020)

Without such a representational tool we cannot effectively communicate the most meaningful purpose for algorithms in design.
We can use representational techniques to draw a house, an animal, or even bacteria in many different ways. For example, we can compose a symbolic cartoon, a technical section, a descriptive portrait, a three-dimensional model, or even an animated video. In each case, the subject matter is revealed in a new light and our understanding of its form is affected.

However, these varied approaches only describe the subject matter in an abstracted reality as a frame of reference. For example, a cartoon does not provide detail, a technical section does not show the elevation, a portrait does not show the reverse view, a three-dimensional model does not reveal movement, and an animation takes place only from a particular point of view. As observers, we are also unable to digest multiple representational techniques at the same time. This leads to architectural drawing sets that describe the whole through the composition of many isolated subsets of the subject matter.

While this strategy functions well for its intended purpose, it provides no solution when attempting to describe design logic prior to a final form. For example, how do we represent the design range of an algorithm that can produce infinite variation between particular bounds? Describing the section view, or any other format, of such a design range would only appear as a fog of overlapping content due to the plurality of potential results. Yet showing only a single instance of the algorithm’s range is hardly representational of its potential.

This problem is compounded by the multifaceted value of algorithms, as they can simultaneously create orthographic views, three-dimensional models, and more of a design with balanced energy systems, envelope layers, and code limits according to the specific input variables. Drawing this would be the equivalent of showing the simultaneous progression of each cell in a fertilized egg as they replicate to reach their final destination in an adult body. Unfortunately, without such a representational tool we cannot effectively communicate the most meaningful purpose for algorithms in design.
Geometrical Density (Perspective, 2020)

More sophisticated shapes seem to conceal their rationale due to the multivariate sources and evolutionary tuning.
Buckminster Fuller understood that a strong and lightweight geodesic dome could be built with consistent elements. He provided a clear rationale and method to reach similar results. In contrast, parametric design tools have enabled the design of elaborate forms often independent of clear design pressures, making their comparative evaluation difficult. Consequently, we are often the recipients of needlessly complex forms in architecture, predominantly due to their visual appeal. Therefore we need a metric for complex forms.

If we evaluate a spectrum of designs we do not control, between the atomic and planetary scales, we can identify a geometrical pattern. Geometry undulates from spherical to highly differentiated and back to spherical, closely following the inanimate-animate-inanimate categorization. For this spectrum to repeat a form at another scale, a pressure to derive the form must be very similar at both scales, and this pressure must be a dominant force. Since electromagnetic forces and gravitational pull are both omnidirectional pressures, this spherical result at both ends of the spectrum is consistent with Fuller’s simple correlation between spheres and uniform strength.

Platonic solids similarly resolve very fundamental omnidirectional pressures evenly but contain a rationale for an exact number of faces. For instance, the valence shell of a carbon atom contains four electrons and, consequently, their mutual electrostatic repulsion produces the tetrahedron. These simpler geometries are immune to evolution; however, as we progress to the middle of the design spectrum, more sophisticated shapes seem to conceal their rationale due to the multivariate sources and evolutionary tuning. Nevertheless, a peak in differentiation can be seen in the middle with animate organisms that contain a number of systems (skeletal, nervous, integumentary...) linking together many scales (cell, tissue, organ, body). Therefore, while form may be weightless it can still correspond to an input density.

Architecture consequently operates at a lower differentiation than animate organisms and should similarly lack some differentiation relative to the peak as buildings will never respond to as many geometrical pressures. Due to the power of nascent parametric technology, we generate overly complex designs relative to the number of input pressures and this upper threshold should be monitored in a general sense. We also need to begin compiling a lengthy list of input forces that all buildings could potentially accommodate in order to comparatively evaluate a building’s geometrical density in detail without an evolutionary process in place. We can then share the absence or presence of a design consideration for each list element and more clearly interpret the value of complex designs.
Pinpoint Coordination (Perspective, 2019)

When comparing two designs, depending on the number of variables we evaluate, ideal coordination can appear to be different.
Coordination in design can be conceptually described using overlapping circles. Within an arbitrary field of view, a certain number of interactive perspectives can be evaluated. For example, we might consider the objectives of five separate perspectives (e.g. circulation, envelope, structure, code, and budget) and outline where they have mutually beneficial goals (e.g. double-loaded circulation at the core permits shorter distances for fire exit code). Certain objectives could be appropriate to only one or possibly many perspectives, and the goal would be to identify design decisions with the most mutually beneficial perspectives included.

In the first example below, with five perspectives, we can identify three zones of highest coordination, sharing a maximum of three perspectives, and with a relatively large area of overlap. If we kept these same perspectives and added another three (e.g. waste, aesthetics, and construction time) we would only identify two zones of highest coordination, sharing a maximum of six perspectives, and with smaller areas of overlap.

In reality, all of the numerous variables are present whether we choose to evaluate the layers and create a highly coordinated result, or ignore them. For instance, whether we choose to measure the carbon score of a project or not, a measurement exists. When comparing two designs, depending on the number of variables we evaluate, ideal coordination can appear to be different. In the third diagram the lightly-shaded pink regions identify choices that would have had a higher mutually beneficial result than the best of option one, yet were not identified in the evaluation due to the limited number of perspectives. Also, while it may be more strenuous to identify the tiny area with the greatest number of overlapping perspectives, this choice includes benefits for the largest number of perspectives so it also represents the highest potential for repetition.
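This counting exercise can be sketched in Python. The decision names and perspective sets below are invented for illustration, not taken from the office's actual diagrams: each candidate design decision is tagged with the set of perspectives it benefits, and coordination is simply the size of that set under whichever perspectives we choose to evaluate.

```python
# Hypothetical design decisions, each tagged with the perspectives it benefits.
decisions = {
    "core circulation":   {"circulation", "code", "budget"},
    "curtain wall":       {"envelope", "aesthetics"},
    "modular structure":  {"structure", "budget", "construction time", "waste"},
    "double-loaded plan": {"circulation", "code"},
}

def best_coordinated(options):
    """Return the decisions sharing the greatest number of perspectives."""
    peak = max(len(perspectives) for perspectives in options.values())
    return {name for name, p in options.items() if len(p) == peak}

# Restricting the evaluation to five perspectives changes the apparent winner:
five = {"circulation", "envelope", "structure", "code", "budget"}
scored_five = {name: p & five for name, p in decisions.items()}
```

Under the five-perspective evaluation the core circulation choice appears ideal, while evaluating all perspectives makes the modular structure the most coordinated: the same decisions, scored against a different number of variables, yield a different "ideal" coordination, exactly as the essay argues.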
Growthless Imitation (Perspective, 2019)

Each plant will develop various discernible differences in form during growth with specific adaptations to its setting.
In order to define appropriate building applications for a genetic design approach, we can use the analogy of a plant proceeding through three predominantly visible stages of growth differentiation. First, seeds will sprout into a form that is predetermined, significantly resembling other plants at the same stage. Second, each plant will develop various discernible differences in form during growth, with specific adaptations to its setting. Third, the build-up of variation over an extended period includes idiosyncratic scarring, and the adaptations mature into a very specific form where the likelihood of imitation is basically null.

While a plant will grow through each of these stages successively, we currently build buildings by jumping directly to a fixed and final stage. Therefore our first step to imitate a plant’s ability for adaptation in genetic design is to select a range of variation in form, bracketed around a single stage, and design for it. In other words, the genetic design would be capable of outputting many unique buildings at a particular stage, rather than showing the growth of a single building across stages.

In order to focus on the most practical next steps for coding a genetic design, we should imitate the middle stage for genetic buildings. At this stage there is enough variation between buildings that it merits the effort to design the genetic code, and the number of environmental factors is also more digestible. If we imitated the first stage, designs would have minimal expressed variation and the genetic code would have no merit for development, as we could just reuse the same building plans over and over. At the final stage, the plant differentiations represent such unique adaptations to a setting over a prolonged period of time that our control over such a genetic code is still far too cumbersome at the outset. For instance, it is more likely that we will want to use a genetic code to propagate solutions for simpler multistory residential buildings than to build multiple Burj Khalifas.
Danielson Architecture Office
