An ideal architectural expression is one that balances the realms of art, technology, society, politics, environment, and more. One can imagine a sphere within which discrete needs from all of these realms are compressed together. Our awareness must first expand to a high resolution to include the breadth of requirements from each realm. Then we must compress these realms together proportionately, reaching an ideal density where many needs can be met in a balanced way. However, an infinite depth of variables exists for us to compress within the sphere, and a growing outer boundary also necessitates the orchestration of pressure on a larger outer surface. This process is further complicated because the variables from each realm have dynamic needs that change over time, making it necessary to constantly review the application of each pressure point and maintain the spherical shape.
Metakinesis is a term we use to refer to the complexity of this compressive action. The delineation of the meta (outer edge) for each project and the intensity of the kinetic force applied to the meta (constantly evolving spherical content) determine the project density. Densities of these realms can range from blissfully disengaged to strenuously integrated and in the interest of comprehensiveness, we strive for the latter.
To design a balanced work, we need to identify, analyze, and assimilate multiple perspectives into a cohesive whole; however, it can be challenging to monitor multiple perspectives simultaneously, or even sequentially, to ensure such a result. An evolving list of dualities is a valuable tool, which we use to prompt a balanced design process. This binary list serves as a reminder of different perceptual forms in order to keep the design process dynamic and diverse.
The following essays are a work in progress and represent an accumulation of random clustered relationships in the philosophy of a metakinetic sphere.
A neural network is a three-dimensional structure where each interconnected neuron is a possible path of travel. Interestingly, trying to explain how to best navigate a neural network also provides insight into how these structures are useful tools.
If we pick a random start and end point within the network, many paths connect the two points as an open-ended solution. However, if we include additional criteria, the constrained path sets become recognizably unique. For example, we can differentiate them with rules such as: the paths must include defined points, be shorter than a certain distance, avoid certain neurons, be restricted to certain diameters, or include a certain number of turns. Once criteria are assigned, these network paths can be interpreted and compared. This process of selection is analogous to decision making in the everyday world, such as how we eat a meal. We can critique the choices of quantifiable neural networks with reasonable intelligibility.
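To make this selection process concrete, the rule-based filtering of network paths can be sketched in a few lines of code; the graph, edge weights, and the specific rules below are invented purely for illustration.

```python
# A toy weighted network: each node maps to its neighbours and distances.
graph = {
    "A": {"B": 1, "C": 2},
    "B": {"C": 1, "D": 3},
    "C": {"D": 1, "E": 4},
    "D": {"E": 1},
    "E": {},
}

def all_paths(graph, start, end, path=None, length=0):
    """Enumerate every simple path from start to end with its total length."""
    path = (path or []) + [start]
    if start == end:
        yield path, length
        return
    for nxt, weight in graph[start].items():
        if nxt not in path:  # keep paths simple (no revisiting a neuron)
            yield from all_paths(graph, nxt, end, path, length + weight)

# Open-ended: every path from A to E.
candidates = list(all_paths(graph, "A", "E"))

# Adding criteria makes the constrained path set recognizably unique:
# must pass through C, avoid D, and stay under a length budget.
selected = [
    (p, l) for p, l in candidates
    if "C" in p and "D" not in p and l <= 6
]
```

Once the rules are explicit like this, the surviving paths can be interpreted and compared with the "reasonable intelligibility" described above.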
The complexity of a neural network problem can also increase exponentially as we include multiple rules and as the two points diverge to admit additional paths. This complexity can even extend beyond the currently available paths to include future considerations, where not all the paths can yet be definitively described. For example, most complex problems must also account for outside, uncontrollable forces, so some rules must estimate probabilities for paths not yet visible. With multiple competing probability rules, we also have to rank their importance. This approximates the conditions of a game of Go, where an opponent’s moves must be contemplated and where strategy is difficult to critique while underway. Such problems are less quantifiable and rely on qualitative interpretations of patterns. While this is more challenging, there are still clear limits to the variables at play since the game has rules and a clear method to determine a winner.
The deepest neural network problems are multifaceted global problems where we are trying to link together multiple qualitative neural networks into a fog of consequence. They require a simultaneous understanding of multiple scales (forest, tree, leaf) while observing from multiple points of view (forward, beneath, within) and planning for systems of growth (embryo, child, adult). At this scale, it is impossible even to describe the rules of the game, let alone the method to determine a winner, because we are aiming for a network of improvement. Both the delineation of the solution AND the problem are neural networks, and the solution changes with the constant change in the definition of the problem. These problems are intractable without computational tools, if tractable at all, as we can only interpret issues sequentially ourselves. Therefore, it is and always will be impossible for us to control the process without depending completely on computational tools to plan and reason. At this deepest level, the best we could hope for is to create a system with immeasurable power and good intentions.
A reciprocal structure is binary in strength but interdependent in assembly. It cannot support any weight until all members are in place, but once successfully prepared, the structure is extremely strong. These structures are prepared the way you would fold the flaps of a cardboard box together (to avoid the need for taped joints) - incrementally folding the sides together or working on all sides simultaneously until they are woven together. If one side is accelerated, the remaining sides are strained or cannot fit together.
The preparation of a genetic design is also reciprocal. For example, the tailorability of the Formid product to human anatomy permits the flattening of data acquired from the hardware (all seats are comparably moving) so that the machine learning algorithm can compare data sets across a wide variety of anatomical idiosyncrasies with ease. As an interdependent design, none of these systems can be prepared without the other, and their interdependency necessitates that progress on the work be simultaneous prior to a demonstration-ready product.
Unfortunately, the evaluation of such a system while it is being designed is not as impressive unless you understand the process. Stress testing along the way would only show a complete failure of the system. For example, capital investment in a project will require proof of sales; however, a project that is large or complex enough may require external funding prior to such proof, leaving the project subject to stress testing and a seemingly total failure.
We can use representational techniques to draw a house, an animal, or even bacteria in many different ways. For example, we can compose a symbolic cartoon, a technical section, a descriptive portrait, a three-dimensional model, or even an animated video. In each case, the subject matter is revealed in a new light and our understanding of its form is affected.
However, these varied approaches only describe the subject matter in an abstracted reality as a frame of reference. For example, a cartoon does not provide detail, a technical section does not show the elevation, a portrait does not show the reverse view, a three-dimensional model does not reveal movement, and an animation takes place only from a particular point of view. As observers, we are also unable to digest multiple representational techniques at the same time. This leads to architectural drawing sets that describe the whole through the composition of many isolated subsets of the subject matter.
While this strategy functions well for its intended purpose, it provides no solution when attempting to describe design logic prior to a final form. For example, how do we represent the design range of an algorithm that can produce infinite variation between particular bounds? Describing the section view, or any other format, of such a design range would only appear as a fog of overlapping content due to the plurality of potential results. Yet showing only a single instance of the algorithm's range is hardly representative of its potential.
This problem is compounded by the multi-faceted value of algorithms as they can simultaneously create orthographic views, three-dimensional models, etc. of a design with balanced energy systems, envelope layers, and code limits according to the specific input variables. Drawing this would be the equivalent of showing the simultaneous progression of each cell in a fertilized egg as they replicate to reach their final destination in an adult body. Unfortunately, without such a representational tool we cannot effectively communicate the most meaningful purpose for algorithms in design.
Buckminster Fuller understood that a strong and lightweight geodesic dome could be built with consistent elements. He provided a clear rationale and method to reach similar results. In contrast, parametric design tools have enabled the design of elaborate forms often independent of clear design pressures making their comparative evaluation difficult. Consequently, we are often the recipient of needlessly complex forms in architecture predominantly due to their visual appeal. Therefore we need a metric for complex forms.
If we evaluate a spectrum of designs we do not control, between the atomic and planetary scales, we can identify a geometrical pattern. Geometry undulates from spherical to highly differentiated and back to spherical, following closely the inanimate-animate-inanimate categorization. For this spectrum to repeat a form at another scale, a pressure to derive the form must be very similar at both scales, and this pressure must be a dominant force. Since electromagnetic forces and gravitational pull are both omnidirectional pressures, this spherical result at both ends of the spectrum is consistent with Fuller’s simple correlation between spheres and uniform strength.
Platonic solids similarly resolve very fundamental omnidirectional pressures evenly but contain a rationale for an exact number of faces. For instance, the valence shell of a carbon atom contains four electrons, and their mutual electrostatic repulsion produces the tetrahedron. These simpler geometries are immune to evolution; however, as we progress to the middle of the design spectrum, more sophisticated shapes seem to conceal their rationale due to their multivariate sources and evolutionary tuning. Nevertheless, a peak in differentiation can be seen in the middle with animate organisms that contain a number of systems (skeletal, nervous, integumentary...) linking together many scales (cell, tissue, organ, body). Therefore, while form may be weightless, it can still correspond to an input density.
Architecture consequently operates at a lower differentiation than animate organisms and should similarly lack some differentiation relative to the peak as buildings will never respond to as many geometrical pressures. Due to the power of nascent parametric technology, we generate overly complex designs relative to the number of input pressures and this upper threshold should be monitored in a general sense. We also need to begin compiling a lengthy list of input forces that all buildings could potentially accommodate in order to comparatively evaluate a building’s geometrical density in detail without an evolutionary process in place. We can then share the absence or presence of a design consideration for each list element and more clearly interpret the value of complex designs.
Coordination in design can be conceptually described using overlapping circles. Within an arbitrary field of view, a certain number of interactive perspectives can be evaluated. For example, we might consider the objectives of five separate perspectives (eg. circulation, envelope, structure, code, and budget) and outline where they have mutually beneficial goals (eg. double loaded circulation at the core permits shorter distances for fire exit code). Certain objectives could be appropriate to only one or possibly many perspectives, and the goal would be to identify design decisions with the most mutually beneficial perspectives included.
In the first example below with five perspectives, we can identify three zones of highest coordination, sharing a maximum of three perspectives, and with a relatively large area of overlap. If we kept these same perspectives and added another three (eg. waste, aesthetics, and construction time) we would identify only two zones of highest coordination, sharing a maximum of six perspectives, and with smaller areas of overlap.
In reality, all of the numerous variables are present whether we choose to evaluate the layers and create a highly coordinated result, or ignore them. For instance, whether we choose to measure the carbon score of a project or not, a measurement exists. When comparing two designs, depending on the number of variables we evaluate, ideal coordination can appear to be different. In the third diagram the lightly-shaded pink regions identify choices that would have had a higher mutually beneficial result than the best of option one, yet were not identified in the evaluation due to the limited number of perspectives. Also, while it may be more strenuous to identify the tiny area with the greatest number of overlapping perspectives, this choice includes benefits for the largest number of perspectives so it also represents the highest potential for repetition.
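As a rough sketch of this kind of evaluation, each design decision can be tagged with the set of perspectives it benefits and ranked by overlap; the decision names and tags here are entirely hypothetical.

```python
# Each candidate design decision maps to the perspectives it benefits.
decisions = {
    "double-loaded corridor": {"circulation", "code", "budget"},
    "exposed structural frame": {"structure", "envelope"},
    "central core": {"circulation", "structure", "code"},
    "curtain wall": {"envelope"},
}

# Rank decisions by how many perspectives they mutually benefit;
# the top entry is the analogue of the zone of highest coordination.
ranked = sorted(decisions.items(), key=lambda kv: len(kv[1]), reverse=True)
best_name, best_set = ranked[0]

# Widening the evaluation can reveal overlaps the narrower one missed,
# changing which decision appears best.
decisions["double-loaded corridor"] = (
    decisions["double-loaded corridor"] | {"construction time"}
)
ranked_extended = sorted(decisions.items(), key=lambda kv: len(kv[1]), reverse=True)
```

The point mirrors the text above: coordination is relative to the number of perspectives we choose to measure, and the apparent best choice shifts as layers are added.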
In order to define appropriate building applications for a genetic design approach, we can use the analogy of a plant proceeding through three predominantly visible stages of growth differentiation. First, seeds will sprout into a form that is predetermined, significantly resembling other plants at the same stage. Second, each plant will develop various discernible differences in form during growth, with specific adaptations to its setting. Third, the build-up of variation over an extended period includes idiosyncratic scarring, and the adaptations mature into a very specific form where the likelihood of imitation is basically null.
While a plant will grow through each of these stages successively, we currently build buildings by jumping directly to a fixed and final stage. Therefore our first step to imitate a plant’s ability for adaptation in genetic design is to select a range of variation in form, bracketed around a single stage, and design for it. In other words, the genetic design would be capable of outputting many unique buildings at a particular stage, rather than show the growth of a single building across stages.
In order to focus on the most practical next steps for coding a genetic design, we should imitate the middle stage for genetic buildings. At this stage we have enough variation between buildings that it merits the effort to design the genetic code, and the number of environmental factors is also more digestible. If we imitated the first stage, designs would have minimal expressed variation and the genetic code would have no merit for development, as we could just use the same building plans over and over. At the latter stage, the plant differentiations represent such unique adaptations to a setting over a prolonged period of time that our control over such a genetic code is still far too cumbersome at the outset. For instance, it is more likely that we will want to use a genetic code to propagate solutions for simpler multistory residential buildings than to build multiple Burj Khalifas.
Identical genetic coding can be flexibly applied, as illustrated by the redistribution in relative size and position of different species’ bones such as the humerus, radius, ulna, metacarpals, and phalanges. This flexibility comes from sequentially nesting the equations of a parametric design. The homology of the fundamental division of the bone structure into five groups remains fixed between these species, while their subsequent differences are expressed through growth controlled by their unique toolkits to express specific variations. This method of design, where control over multiple “species” is achieved while reusing the same genetic code, is how innovation in architecture should presently take place. An important caveat in this description is that all bodily systems must be designed for simultaneously by the governing algorithm, as major changes will need to be considered to finalize the fixed rules while merging all necessary systems together into a single compatible set of genetic code.
With this design code, the generation of organisms capable of expressing differences equivalent to species with completely different modes of travel will be possible with relative ease. Another important caveat in orchestrating this genetic system is that a specifically intended final application for the genetic code is necessary in order to rigorously restrict the code according to fixed rules that coordinate well with the process of fabrication or growth. This approach will also test the combination of all parameters as a whole for their capacity to survive. Individual systems tested separately are not a good measure of an organism's capability to survive, as the whole solution can be significantly compromised by one system. Therefore the objective is to define the collective set of key architectural genes in order to form the basis for a plethora of built work around the world.
Similar to the race for artificial intelligence, architecture will soon be based on the most competent parametric designs, with vastly integrated advantageous qualities built into the design framework that can negotiate the widest range of outcomes with the most competent fixed rules. In this illustrated example, we have the same fixed rules for bone organization controlling the vast differences between modes of transportation, including terrestrial, airborne, and aquatic abilities. These differences are calibrated to many other factors such as changes in bone density, weight of the body, speed of growth, and scale of the body. Selective pressures for the building industry that will influence the prioritization of algorithms will be many, including cost, spatial efficiency, ease of construction, time to construct, durability, reparability, replicability, transportability, aesthetic appeal, simplicity, updatability for future systems, and sustainability. The weighted value of each of these variables will change over time with ease, as they are all algorithmically integrated. This will result in a constant, slight modification of the design occurring endlessly in pursuit of the latest measure of efficiency, and will unravel categorizations by release date, such as version 1, 2, 3, in favour of speciation as a pure continuum.
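One hedged way to picture these shifting selective pressures is a weighted scoring of design variants, where re-weighting the criteria over time changes which variant survives; the variants, scores, and weights below are invented for the example.

```python
# Each design variant is scored (0-1) against a subset of the selective
# pressures named above; values are illustrative placeholders.
designs = {
    "variant_a": {"cost": 0.6, "spatial_efficiency": 0.9, "durability": 0.7},
    "variant_b": {"cost": 0.9, "spatial_efficiency": 0.5, "durability": 0.8},
}

def score(design, weights):
    """Weighted sum of criteria; the weights encode current priorities."""
    return sum(weights[k] * v for k, v in design.items())

# Today cost dominates; later, durability might.
weights_now = {"cost": 0.5, "spatial_efficiency": 0.3, "durability": 0.2}
weights_later = {"cost": 0.2, "spatial_efficiency": 0.3, "durability": 0.5}

best_now = max(designs, key=lambda d: score(designs[d], weights_now))
best_later = max(designs, key=lambda d: score(designs[d], weights_later))
```

Because the winner flips as the weights drift, the "current best" design is a moving target, which is the continuum of speciation the essay describes.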
We step out into a dark trailhead and a cool evening breeze to start our climb after the busy season. Mountain huts are inhospitably closed, we are traveling with minimal food and clothing up into the thin air, and nearly alone along the foreign trail. We reach a frigid and exposed windy summit with a rising morning sun surfacing above the clouds roughly 4km above sea level. We take a moment to watch the clouds turbulently roll past us eventually being displaced by a calm sun-soaked blanket. We then descend an endless knee-dislocating stone slope to catch a generous lift from a passing local truck driver back into town.
Our picturesque experience at the summit was rather brief relative to the effort required to ascend and descend. However, the trial has had an enduring value. In chemistry, a high-pressure environment can, over time, increase an element's hardness and capacity to withstand abrasion from other elements. Similarly, exposure to difficult or stressful human experiences can be beneficial for enduring future interactions without necessarily knowing the exact challenges ahead. New boundaries can be set for our everyday activity based on the strengthening of our limits through difficult self-induced experiences.
While such durability to our human character can be desirable, navigating this subject matter is challenging when considering the workplace relative to compensation. Once someone completes their education, how should they be expected to withstand hardship since it can be pursued personally and also offered in service? Some employers and clients pay generously, while others can go as far as to expect payment for the opportunity. Furthermore, a workplace can eagerly pursue challenging unfamiliar work with opportunities for self-development or remain laid back with a steady return on repetitive work. In architecture, quality output depends heavily on a collective pursuit of solving difficult problems and there exists infinite room to develop our work in detail. The desire to pursue hardship inevitably necessitates a careful matching between the difficulty of the task and human tolerances. Finite resources are available for any project so to achieve such results, compensation also inevitably becomes disassociated from effort.
Specialized work and commensurate consumption have been a priority in the world but this has disillusioned us into a social ecosystem stuck in pursuits to become the most differentiated and unique. This is risky when it ranks above the need for collective human and environmental health. This root drive makes it difficult for us to relinquish, for instance, our fossil fuel dependency as it would truncate our individual competitive capacity to stand out in performance. We desperately require a tool to gauge our collective needs according to the conditions around us so that we can set a lasting homeostasis as our pinnacle achievement.
As a society we have struggled to identify the necessary conditions for a shared quality of life but it can be difficult to see beyond the complexity of our man-made environment. One point of departure might be to develop the criteria to endure isolation whilst maintaining good health in a more natural environment. Remote living conditions expose us to the elements and offer limited support for our specialized tendencies. If we run out of food, our abundance of wood or art will not nourish us when we cannot trade our strengths to nourish our weaknesses. In isolation, we must risk our health on the basis of our competencies in every category necessary for survival and quality of life simultaneously. Remote conditions can immediately focus our attention on maintaining a healthy proportion of values and skills necessary for quality of life.
From such fundamental criteria for an individual, we can build in adjustment factors for a collective without losing an understanding of the overall balance. Efficiencies are gained through shared responsibilities within a community and through our technological skills. These efficiencies can support added lifestyle benefits without added drain on the collective health. Community efficiencies spanning great distances, and conversions in value through monetary transactions, will need to be carefully tied back to root impact. Our individual definition can also be tweaked to accommodate variations in context from country to country, leading to a fundamentally similar but also more diverse and sustainable end result.
Inhabitation logic should be nested so that we are solving for an individual, a city, and our spherical home dependently. This way, we can filter our actions to align with a healthy and sustainable homeostasis that is both recognizable to individuals and shared. On such a foundation, it is possible that a place of overpopulation could be reclassified as a healthy network. After all, a dense forest is no less in tune with its surroundings than a single tree.
When contemplating our design ability as a species, it is informative to measure the length of time we allocate to a design solution. A residential project will typically take a month to a few years to outline, larger public projects can take years to decades, and in some more unique instances we have managed to commit over a century to a project. However, despite these lengthy commitments, they are immeasurably short relative to a persistently evolving plant that has developed for millennia. Our buildings do not grow, are not edible, and do not erect themselves. If we were to leave the planet unattended, our buildings would vanish under the propagation of photosynthesizing construction.
Plant qualities are not undesirable in architecture, they just seem unreasonable as a goal given our current attention span. To achieve a deeper, more meaningful result, we need to elongate our design cycle and this requires a commitment to layers of maintained organization that can be passed on to subsequent generations. The implications of updating one variable in the design must be apparent to the whole. This requires both a sophisticated comprehension and representation of coordination so that the intelligence of the design does not succumb to short-term goals. At first, such a project would seem ordinary as we would begin the process much like any other design but eventually as it evolved, the focused effort would cultivate a design that appeared unfamiliar. Beyond that it would become unmatchable without a similar investment of time.
We are programmed to have two arms, twenty digits, sensitive fingertips, colour vision and a finely tuned central nervous system. Despite this prescribed layout, considerable biological variation exists between people as our DNA will permit variation of certain characteristics while others remain relatively fixed. For example, skin tone, muscle density and lung capacity may vary but people cannot photosynthesize radiant energy with their skin or breathe through their arms.
Nature relies on DNA as a fundamental tool to store critical rules of construction and organize methods to permit variation for tailored results. As a result, depending on your genealogy, social pressures, climatic experiences, etc. your body could be tuned to manage a densely populated arid setting more effectively than a wild, unpopulated temperate region – without neglecting the fundamental rules to include one brain, one heart, and a pair of lungs. For obvious reasons, the risk of inhaling sand cannot justify the deletion of an oral apparatus necessary to ingest food. Due to this delineation of critical rules and flexible rules, we can evolve a resiliency for a specific habitation over many generations without dismantling our entire cellular logic.
Architecture, on the other hand, has become exponentially more complicated as the volume of information we can assess grows. Now, the resolution of each project depends on a greater number of professionals who are expected to coordinate libraries of information. Each new project is also ambitiously unique, meaning previous strategies are constantly re-researched and re-developed.
If we instead start to compile architectural DNA as a semi-flexible logic, we can build-in a superior design intelligence and reuse it to propagate the environmental, social and technological benefits inherent within. This strategy is now possible with parametric tools. Foundational criteria from the Living Building Challenge or the Building Code can be established as critical rules that are automatically followed and subsequently set aside. We can then make every project unique as we focus on a flexible response to the variation in local topography, economy, program, client, etc.
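A minimal sketch of such semi-flexible architectural DNA might separate fixed critical rules from flexible, site-responsive genes; the rule values and site parameters below are placeholders, not citations of any actual building code or standard.

```python
# Critical rules: established once, automatically enforced, then set aside.
# These values are hypothetical stand-ins for code-derived limits.
CRITICAL_RULES = {
    "min_exit_width_m": 1.1,     # placeholder life-safety rule
    "max_floor_area_m2": 600.0,  # placeholder compartment limit
}

def generate_building(site):
    """Derive a unique design from flexible, site-specific genes while
    the critical rules stay fixed across every instance."""
    return {
        "exit_width_m": max(CRITICAL_RULES["min_exit_width_m"],
                            site["occupants"] * 0.005),
        "floor_area_m2": min(CRITICAL_RULES["max_floor_area_m2"],
                             site["lot_area_m2"] * site["coverage"]),
        "orientation_deg": site["solar_azimuth"],  # purely flexible gene
    }

example_site = {"occupants": 120, "lot_area_m2": 900, "coverage": 0.5,
                "solar_azimuth": 180}
building = generate_building(example_site)
```

Every output is unique to its site, yet every output satisfies the same critical rules, which is the reuse of design intelligence the strategy proposes.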
As forests are populated by trees that have grown from seeds to synchronize with their precise location, we can populate a city by articulating different architectural seeds. Just as forest boundaries can be demarcated, architectural flora will naturally develop their own ideal regions and mark an evolution in our performance as designers.
Both feet are pulled off the floor even though his fist is confined to one inch of horizontal movement. Only precisely coordinated movements can build up the required momentum for this result under such a constraint.
We could all mimic each movement in isolation and generate a small force but the fluid connection of these forces from the ground up to the fingers is far superior and requires practice. Eventually this synchronicity becomes unconscious. Practice substitutes for thought and the muscles themselves recognize the appropriate timing for coordination. A practitioner can then consciously focus on observing his adversary and seek the appropriate time to trigger the build-up of unconscious momentum. Only practice and constraint nurture such unconscious coordination so that the mind can maintain focus on unpredictable surroundings and preserve the ability to act.
When will a building fully disintegrate? Will the envelope, electrical, mechanical and structural components all disintegrate together, in phases, or randomly? Since many of our building materials today are not environmentally neutral as waste, we do not want to tear down or abandon buildings due to partial failure while some materials are still working effectively.
Whether a building is cheaply or expensively built, synchronization of disintegration down to the screws and maintenance thresholds is critical. We can attain a mutually beneficial result when building failure is synchronized as capital investments in material are maximized and environmental draws are minimized. Due to rising material pressures, improved coordination in design can preserve our resources more accurately but we need to justify the investment of design time to coordinate these details beyond our typical practice of focusing on functional requirements. Such an investment of time can only be achieved today with the opportunity for repetition of a particular design.
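As a speculative sketch, the alignment of component lifespans can be checked numerically; the component names and year values below are illustrative only, not measured service lives.

```python
from math import gcd
from functools import reduce

# Hypothetical service lives, in years, for major building systems.
lifespans = {"envelope": 40, "mechanical": 20, "electrical": 30,
             "structure": 60, "fasteners": 60}

def lcm(a, b):
    """Least common multiple of two positive integers."""
    return a * b // gcd(a, b)

# Years until every system would reach end-of-life together -- a crude
# measure of how far these lifespans are from synchronized disintegration.
sync_horizon = reduce(lcm, lifespans.values())

# Remaining value wasted if the building is abandoned when the
# shortest-lived system fails first.
first_failure = min(lifespans.values())
wasted = {k: v - first_failure
          for k, v in lifespans.items() if v > first_failure}
```

A large horizon and a large wasted remainder both signal poor synchronization: materials with decades of capacity left would be discarded at the first partial failure.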
It is important to reinforce a connection with industrial processes as knowledge of material origins can help us take responsibility for the associated material and energy waste. Until an understanding of a material's origin is acquired, the management of subsequent processes cannot be truly responsible.
Today we are not required to monitor this first stage since industrialization has eliminated the need for most of us to think about it. Fewer and fewer people have become responsible for understanding these first steps, and our quality of life has appeared augmented as we only need to focus on material use.
Our generational dislocation from this beginning stage has cut us off from the most obvious source for learning about our dependence on balanced life-cycles. We don’t have to watch our food grow, experience drought, sleep according to the natural rhythm of daylight, or raise children for support in old age. Balanced processes do not form the foundation of our thoughts and it is easier to abuse a system that does not show the end user any consequences.
We easily forget that our current life status depends on such intense and unbalanced consumption. We must make the effort to connect the choices we make at the final stages with those at the beginning stages, or else we will not recognize our skewed perspective, and it will make us prone to collapse.
Historically, we moved towards construction with concrete and steel as they became more efficient structural systems. They consumed less space so we could accommodate dense urban centres where spatial efficiency is paramount. However there are zones in the transect between city centre and farmland that are not spatially constrained and where the efficiency of these materials is overkill.
Due to overuse of high energy materials, we now struggle to facilitate the use of materials such as rammed earth, straw-bale or other low energy options. Such construction now requires specialized labour and engineers, which makes it expensive to employ these options even though the material itself is cheap. Our adaptability to the transect has depreciated as we prioritize the expectation for high spatial efficiency everywhere.
The overuse of efficient materials has relegated many others to a "worse" category, and this lets us relinquish more products into the waste stream. We need to continue to use a diversity of materials, even inefficiently according to our current standards, in order to maximize appropriate allocation. This will not only limit environmental damage by maximizing the use of existing materials but will also limit the necessity to invest further energy into new production.
Since the resilience of any lifestyle is tied closely to the range of elements that can sustain it, the efficient but limited industrial processes that we rely on produce a bottleneck that will eventually constrain our ability to maintain what we have produced. Greater efficiency of our current systems cannot replace the value of a diversity in the mediums that can support us.
We often do not anticipate the need for repair, yet the study of repair is an essential source for our knowledgeable use of products, and it can only be learned by hand. For example, casting services into concrete limits a building’s lifespan as the structure cannot be altered to repair a service. This dilemma is exacerbated when we remove our tools and storage cabinets from our apartments, since we are withdrawing from our familiarity with repair. It also occurs at a larger scale when cities push industrial land to the outskirts. We have become accustomed to replacing our surroundings in their entirety, but we must not forget the inevitable need for skilled labour during the lifespan of a building if we want it to last as long as possible.
For a long time, vernacular architecture has been respected as a sustainable strategy because of its ingenious use of local materials. This resourcefulness is based on the expectation that failure will eventually occur and is best corrected with materials immediately at hand. For example, the pristine quality of the structure in the Japanese garden of Kairaku-en has been sustained through the adaptation of nearby resources, and consequently it still looks like the original building.
Inventiveness used to be tied to survival, but globalization has made us dependent on distant resources, and this weakens our ability to correct future failures as material access is constrained. In order to ensure architectural projects can last beyond our lifetime, we need to maintain an investment in local industrial processes and plan for failures.
Le Corbusier’s High Court in Chandigarh offers a range of articulated volumes that are beautifully graded and proportioned. The fin entrance, the undulating overhangs and the mosaic skin are all framed at different scales as though you could continue to appreciate this pattern through subsequent magnification. Evaluation over time instead of scale reveals an intricate formwork puzzle. While it announces itself as a construction feat upon completion, there is a simultaneous underlying question about reasonable resource use for formwork. While resource use might not have been the focus at the time of construction, it raises the question "how do we critique a building today?"
Much like the judicial system, buildings require a negotiation of ethical ideals as architecture embodies our values. It is not as simple as describing positive attributes about a building. Architecture is a very broad field and it is easy to speak positively about any project because there will always be positive qualities.
Environmental systems, personal goals and even aesthetic principles are misleading when we judge the effectiveness of one creation in contrast to another. Merit now needs to be recognized along with our limited resources and broader consequences. In order to preserve our livelihood it is just as important to identify what we did not build. For example, how do we recognize the value of those who hold back on their building volume? Since this is one of the most effective ways to make an environmental impact, it is problematic that it passes unnoticed.
This problem is not unique to Le Corbusier's High Court, and it is often not under the architect's control alone; it is the responsibility of everyone involved. The choice to resist consumptive gain in favour of self-sacrifice for the sake of the environment is no reflex act; it requires regular practice. Sustainable choices can only thrive when we first contrast our decisions with the absence of creation.
Interpretation is a technique. Both in art and in science, the skill relies on a mix of rational and non-rational thought. Interpretation can be quite challenging as any transmission of information through our minds will always carry a subjective quality. Verbal descriptions of facts, direct quotes and representations are all subject to this fate. Even if they were deemed purely objective at one point, as soon as we process them, we have made them fractionally subjective and personal. This is why, over time, we modify our opinions: our own environmental conditions have changed and in turn have changed us. Nothing is static and definitive in interpretation.
Consider what it would be like if we were capable of conforming entirely to either option. First, a completely subjective understanding of our world would result in cryptic mental isolation. We could not relate to any other individual since we would have no means to find common ground. With no common ground, how could we ever hope to achieve an understanding of creations with abstract meaning? On the other hand, in a completely objective scenario, we would become completely homogeneous. There would be no need for theory, only formulas capable of processing variables of predetermined value leading to identical responses, just actions justified by a 1 or 0. With perfectly identical common ground, there is no need for choice; everything is already decided and life exists to reach death. However, in our real world, words on a page assembled to form specific meaning can have as dramatic an impact as to invoke or mitigate war based on their interpretation.
Our interactions have the opportunity for varied impacts when set in different contexts due to the mix of rational and non-rational expression. Therefore, there cannot be a pure conclusion; life is too complex, and imprecision is also favourable to avoid stagnation. No single answer will suffice to explain any creation. It requires a healthy balance between consensus and individual opinion. Even statistical significance, although computed through strict mathematical terms, still relies on the scientist’s belief in the limits of significance. One in twenty is now generally accepted as delimiting significance from insignificance, but this value is not determined by a purely rational method; it has developed from the consensus of scientific opinion. Therefore the determination of mechanistic, scientific knowledge still rests on personal and fallible conclusions. The tendency to seek purely rational and finalized answers seems so desirable today, but abstraction undeniably remains.
GRAY - DIENT SPACE
A limited number of interpretations comprise the canonical perspective of Eileen Gray's E.1027 residence in Roquebrune Cap-Martin, France.
Gray rejected the pure objectivity of her time and strongly advocated the importance of going beyond objective aesthetic rules into something alive and dynamic. While it is not a precise technical marvel, her work offers a dense gradient of adaptable spatial conditions to accommodate a variety of environmental and social needs. The architectural flexibility of E.1027 leaves the occupant perpetually capable of adjusting to their satisfaction.
The surrounding landscape includes an exposed bathing area, a shaded terrace under the pilotis-raised building, a small tiled area amongst a variety of plants, a tree-shaded lower courtyard and a secluded north-facing exterior dining area.
Indoors, the service core behaves as an interstitial space adjoined to the three cellular compartments. Access to each area can be navigated discreetly. The living room, master bedroom and guest bedroom can be occupied effectively without disruption from adjacent areas as they each have space to sleep, clean, work and escape to the outdoors. The living room offers a more public version of the same cellular components while the service core offers a more rudimentary variation.
Depending on the occupancy, the building can function as a home for reclusive neighbours or it can provide a medley of spatial and climatic experiences. The design is notable for its compact resolution of spatial multiplicity.