With its highly visual approach to building analysis and simulation, ECOTECT has enormous scope to be used as a teaching resource at any level in both architecture and engineering courses. This article aims to provide some guidance on how to introduce ECOTECT to students. It discusses some of the main problems sometimes encountered and offers a series of suggestions for avoiding them. Whilst it concentrates on coursework integration, you should also check out some of the examples of its demonstration potential.
Obviously, as a hands-on experience for students it can be either a pleasure or a nightmare, depending on the project requirements and their level of proficiency. However, even if used solely by the lecturer, its potential for the interactive demonstration of quite complex concepts should not be overlooked.
The decision to introduce analysis software in any course is always a tough one. The main problem is invariably the amount of time required for students to gain sufficient proficiency to actually make use of the analysis results -- too often they struggle to get any results at all, let alone valid comparative ones. This can lead to a disproportionate amount of time being spent on the mechanics of the process rather than the content of the course, resulting in a bad experience all round.
Fundamentally the software developer is a major part of the solution -- adequate tutorials, help files and background material can make the lecturer's job much easier. However no amount of on-line help is sufficient if no-one ever reads it -- and no-one has time to read when their project is due at 9:00am the next morning. Thus the nature of the project, and the time allowed for it, are also significant factors.
A project in which students have to learn the software, model parts of their current design project and do thermal, acoustic and lighting studies -- all in six weeks -- will more than likely result in a great deal of frustration and very little work of any value. There are several reasons for this:
- Students are usually quite bright, know the system and are unlikely to start working on an unfamiliar tool very early -- it will take time, and they know they can be more productive doing other things first. Better to wait until a few others gain some experience and benefit from that. The result is a delayed or slow start.
- If using their current designs, these may still be quite fluid and unresolved. There is no point simulating rooms that may not even end up in the final design, so students will likely wait until their designs resolve before starting anyway.
- Even if the students are well grounded in the fundamentals of lighting, acoustics and heat flow, just the nomenclature and approach used in the software can take some time to get used to. Even an experienced user will have to think carefully and plan the model around the different requirements of each analysis process -- a first-time student has no hope.
From my own experience in running workshops and helping kick-start courses, the most difficult and time-consuming step in using any simulation software is the generation of the model. If you are learning to use the software whilst trying to create your model, it will take a great deal longer. This is the area that needs to be addressed first, and it's usually where the whole process falls over.
There are a number of ways to avoid problems with this:
- Use Prepared Example Models
If the course is primarily focused on the analysis, there is really no need to waste time having students generate their own models. Either set up a series of relatively simple examples (whose complexity will not get in the way of understanding what is going on) or use one from the Tutorials directory. This way the students can concentrate on the interpretation of the results and the changes they can effect with simple modifications.
- Start Early
Instead of introducing the software at the same time as you intend the students to use it for real, start out in an earlier unit if your course structure allows it. This could be as simple as introducing it as a demonstration tool that they can play with to investigate specific ideas/concepts. An example might be showing them how they can interactively drag the Sun around or choose different locations to understand solar geometry, a simple in-class exercise with no associated project work.
This creates familiarity with the interface and, with sufficient lead time, may even result in some of the students investigating it further on their own. It only takes one reasonably proficient and confident user to lift the skills of an entire class.
- Start Simple
If you start out with very simple analysis problems, it is much easier to focus on the relationships between the individual parameters of the problem. Thus for a Sun-penetration example, a simple square box with one window will set the scene. Once the students have worked out what they are looking for and how it changes with time, then they can jump to OpenGL view and start cutting sections through an atrium, etc.
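For a box-with-one-window exercise of this kind, the underlying geometry is simple enough that it can even be sketched by hand before the students touch the software. The following Python sketch (illustrative only, not ECOTECT's internal method; it uses the standard Cooper declination approximation and the usual solar altitude relation, and assumes solar time and a south-facing window) estimates how far direct sun penetrates into the room:

```python
from math import sin, cos, asin, tan, radians, degrees

def solar_declination(day_of_year):
    # Cooper's approximation for solar declination, in degrees
    return 23.45 * sin(radians(360.0 * (284 + day_of_year) / 365.0))

def solar_altitude(latitude_deg, day_of_year, solar_hour):
    # Standard altitude relation from latitude, declination and hour angle
    decl = radians(solar_declination(day_of_year))
    lat = radians(latitude_deg)
    hour_angle = radians(15.0 * (solar_hour - 12.0))
    alt = asin(sin(lat) * sin(decl)
               + cos(lat) * cos(decl) * cos(hour_angle))
    return degrees(alt)

def penetration_depth(window_head_height, altitude_deg):
    # Horizontal distance direct sun reaches into the room through a
    # south-facing window at solar noon (ignores azimuth and glazing)
    if altitude_deg <= 0:
        return 0.0
    return window_head_height / tan(radians(altitude_deg))

# Noon on the winter solstice (day 355) at 52 degrees north:
# low winter sun reaches deep into the room
alt = solar_altitude(52.0, 355, 12.0)
print(f"altitude: {alt:.1f} deg")
print(f"penetration: {penetration_depth(2.1, alt):.1f} m")
```

Once students have verified a hand calculation like this against what they see on screen, the interactive model becomes a confirmation of something they already understand rather than a black box.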
- Calculate Often
If embarking on a more complex project, there is absolutely no point in the students waiting until they have fully completed their fabulously detailed geometric models before starting an analysis -- they will likely be hit with hundreds of potential error or warning messages that will take forever to track down.
However, if they constantly run quick analysis checks as they go, knowing full well that the results will be garbage, they can much more effectively keep track of modelling issues and errors as they arise. This way they know exactly what they have changed since the last successful run, so can deal with it immediately -- resulting in much greater confidence in both the model itself and the results generated from it.
There is nothing more soul-destroying than working for ages on a complex model and then watching the errors pile up when you first hit the calculate button.
- Focus on One Analysis
Every simulation tool has the indelible mark of its developer in the implementation and interfaces it provides to different processes. These can take some time to work out, even if you are well grounded in the fundamental principles on which they are based. Whilst the underlying premise of ECOTECT has always been that design decisions affect many different aspects of a building's performance, it can often be very difficult to effectively introduce several different analysis areas at once.
The author's own experience suggests that students respond better to things that they can actually achieve and potentially excel at. If the first few analysis tasks are relatively simple and comprehensible, many will voluntarily look deeper and begin their own investigations. If those first tasks turn out to be complex, confusing and inconclusive, that sets the tone for their entire outlook on this type of analysis.
Thus, if these first analysis attempts can be kept focused on one area (thermal, lighting, acoustic, etc), then it is easier to look deeper and feel some sort of accomplishment at having discovered and understood a more complex aspect of a building's behaviour.
The most successful integration of ECOTECT into undergraduate architectural coursework the author has experienced thus far was when it was staged over different levels. In first year, simple example models were used to introduce basic solar position and overshadowing concepts. At this level the students were taught to interact with the models by dragging the Sun and changing dates/times, progressing to some simple window manipulations. There was no formal submission requirement; it was used purely for demonstration purposes, so the students did not feel pressured whilst using it (the focus in the course was more on weather data analysis and deriving design information from climate).
In second year, more complex thermal concepts were introduced. This built on their solar position experience by beginning with incident solar radiation on surfaces and direct solar gains. The concept of thermal zones within a building is actually quite sophisticated and can take quite a while to properly understand. This meant progressing the students through single rooms, attached sun-spaces, multi-storey buildings and then to the virtual zoning of open-plan office spaces. At this stage several students were already using the software as part of their design work, even though it was not a requirement.
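The incident solar radiation work mentioned above reduces, for the direct beam component at least, to the standard cosine-of-incidence relation. A minimal sketch of that relation (illustrative only, not ECOTECT's own implementation; function names and the irradiance values are assumptions for the example):

```python
from math import sin, cos, radians

def direct_irradiance(dni, sun_alt_deg, sun_az_deg, tilt_deg, surf_az_deg):
    """Direct beam irradiance (W/m^2) on a tilted surface.

    dni is the direct normal irradiance; all angles are in degrees,
    with azimuths measured clockwise from north.
    """
    alt, az = radians(sun_alt_deg), radians(sun_az_deg)
    tilt, saz = radians(tilt_deg), radians(surf_az_deg)
    # Cosine of the angle between the sun direction and surface normal
    cos_incidence = (cos(alt) * sin(tilt) * cos(az - saz)
                     + sin(alt) * cos(tilt))
    # Surfaces facing away from the sun receive no direct beam
    return dni * max(0.0, cos_incidence)

# Sun at 30 deg altitude, due south (180 deg), with an assumed
# 800 W/m^2 direct normal irradiance:
print(direct_irradiance(800, 30, 180, 90, 180))  # vertical south wall
print(direct_irradiance(800, 30, 180, 0, 180))   # flat roof
```

Comparing a vertical wall against a flat roof for winter and summer sun angles makes a useful first exercise, because the surprising seasonal reversal (walls collecting more in winter, roofs in summer) drops straight out of the single cosine term.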
In third year came lighting and acoustics. In this particular instance it was felt that the students were capable enough to move straight to the use of RADIANCE for lighting simulations. This received a positive response from the students as they could immediately see applications for more realistic visualisations in their own work. The acoustics, however, was kept relatively simple, focusing on geometric reflections and ray-tracing analysis -- once again to emphasise a playful and highly visual experience.
In the professional years of the degree, it was considered that ECOTECT and other analysis tools were simply a part of the design process and that students would have the skills to decide how or if they were to be used on their own.
Whilst architectural education in general provides a good grounding in building physics, in commercial practice much of this knowledge is quickly lost. There are many reasons for this, principal among them being the perception that architects can never really know everything about their buildings and will usually defer to specialists in each area anyway. Thus a detailed knowledge in any one area is not seen as fundamentally important.
Simple economics dictates that it is not viable to involve specialist consultants in the day-to-day churn of the early design process. However, for analysis and simulation to contribute at this early stage, the designers themselves must be able to formulate tests, carry them out and then utilise the results to make clear decisions. This is relatively straightforward in areas such as lighting, shading and sun penetration. However, confidence quickly falls away when it comes to thermal analysis, incident solar radiation and regulatory compliance. Even though it is possible to generate and test simplified models, extrapolating the results to a larger design context requires a much greater understanding of both the limitations of the analysis method and the complex physical processes involved.
In order to achieve the step change in building performance required by both pending EU directives and UK legislation, such an understanding will be required if the architect is to retain control of the design process.
It is therefore important that current students be confident that, with the right tools, they can undertake this kind of work and use it effectively as part of their standard workflow. Certainly, an engineer will still be needed to validate and sign off on the final design, but the hope is that the architect will have guided the process and asked the consultants all the right questions.
With students becoming increasingly sophisticated and literate in their use of computers, and the profession moving more towards computational analysis as a fundamental part of the design process, the integration of simulation software into the core coursework of architecture and engineering students appears inevitable at some point. Obviously every course is different and every group of students will exhibit its own character and capabilities. However, early experiences in this area have shown that a managed approach and a gradual process of introduction are both very important.
Whilst we are continually developing new on-line resources and teaching material, computational simulation is still a complex subject to teach and involves a fair degree of intuition. Unfortunately a highly developed intuition only comes with familiarity and understanding, as well as the confidence to trust it. Thus, it is important to begin developing that confidence early.