Laiserin's Lemma—I/O, I/O, it's off to work we go!
(lemma: a short theorem used in proving a larger theorem)
Jerry Laiserin

Now that engineers, architects, constructors and owners are poised to adopt collaborative design via data-intensive, 3D-and-more digital modeling, researchers are studying input/output (I/O) media and methods that may better support the new design tools. From laser scanning and gesture recognition to augmented reality and gaming, the entry, manipulation and viewing/reviewing of design data will be radically transformed.

Before the discipline of interactive computer graphics got started in the early 1960s, designers who wished to use computers "simply" typed vector data into an alphanumeric terminal and waited a few hours to view pen-plotted output. Early "CAD" prototypes used light pens to "draw" directly on vectorscope displays. As CAD gained wider acceptance, hardware economics drove the adoption of raster displays and digitizer tablets or mice, albeit with extensive keyboard input still required in most commercial programs.

While these historical interface techniques facilitated drawing-centric personal productivity software, they have now become stumbling blocks on the path to model-centric group collaboration software. A wide range of opportunities for improvement is currently being explored.

> Laser scanning, to capture existing building conditions or plant/process infrastructure as 3D "point clouds" that can be translated directly into, or integrated with, 3D digital design models (a small illustration of one step in that translation appears after this list). The leading commercial offerings in this mode of design input are Cyra and Quantapoint.

> Drawing-to-model conversion, to automate the transformation of existing 2D drawing data into 3D digital model form. The leading commercial offerings in this mode of design input are Plan2Model from Graphisoft and PlanTracer from Ideal—both of which rely on pattern recognition technology from Consistent Software of Irigny, France.

> Pen interfaces capable of tasks such as recognizing gestures (as input/edit commands) and/or inferring 3D massing from 2D outline sketches (a minimal gesture-matching sketch also follows this list). Much of the theoretical and development work in this area has been accomplished by Mark Gross, Ellen Do and others at the Design Machine Group (DMG), University of Washington (USA), under self-explanatory project names such as Digital Clay, Digital Sandbox, Electronic Cocktail Napkin, Gesture Modeling, Sketch-VR and SpacePen. Commercial products reflecting just a fraction of DMG's concepts (although not necessarily derivative of DMG research) include @Last Software's SketchUp, Autodesk Architectural Studio and Nemetschek PlanDesign on D-Board. Newer tools developed specifically to leverage the capabilities of Windows XP Tablet PC Edition include Alias SketchBook Pro and Corel Grafigo.

> "Cybrid" digital/physical interfaces that move I/O out of the beige box and into the environment. Originally called "ubiquitous computing" (see Marc Weiser's 1991 Scientific American article, "The Computer for the 21st Century"), this trend now is variously identified as "pervasive" computing or "disappearing computers." Examples include: Bill Buxton's (of Alias) Portfolio Wall, which we reported on in IssueOne; the InterWall and InteracTable from Nemetschek and Wilkhahn, which we reported on in IssueNine; the "BlueSpace" joint-venture between IBM and Steelcase; and some fascinating work by Hans Gellerson and others at the Cooperative Systems Engineering Group (CSEG), Lancaster University (UK) that includes "Pin and Play" networkable wall surfaces, interactive "Ambient Displays" and the "Sensor Table" prototype of I/O furniture.

> As we noted in IssueOne, virtual reality (VR) and augmented reality (AR) systems support direct physical interaction with 3D digital models. Much of the pioneering research on VR for architectural and community design environments was conducted by Jim Davidson, Dace Campbell and others in the Community and Environmental Design and Simulation Laboratory (CEDeS Lab), run jointly by the College of Architecture and Urban Planning and the Human Interface Technology Lab, University of Washington (USA). However, as we explained in IssueOne, there are important differences between immersive virtual environments, whether projected entirely within head-mounted displays or viewed in a C.A.V.E., and augmented reality, which superimposes digital objects on a goggled view of physical space. Steven Feiner's work at Columbia University's Computer Graphics and User Interfaces Laboratory, which we also reported on in IssueOne, digitally augments the user's mobile experience of real-life, full-size environments. In many design situations, however, the 1:1 scale of a mobile augmented reality system (or of a C.A.V.E. VR system) may not be as informative or useful to a collaborative design team as augmented reality applied to 3D digital scale models. The latter approach is under exploration by Alan Penn and others at the VR Centre for the Built Environment, University College London (UK), under the project name ARTHUR (Augmented Reality Round Table for Architecture and Urban Planning).

> Adapting low-cost, easy-to-use gaming interfaces over the web for real-time 3D interaction with design environments. Significant strides have already been made in this area of investigation under Paul Richens and Simon Ruffle at the Martin Centre CADLAB, Department of Architecture, University of Cambridge (UK), through adaptation of the Quake II game engine from id Software. Just as in the Quake game, multiple users can interact with each other (via avatars) and with the 3D digital model in real time over the internet, in an easy-to-use browser interface. The initial work is now being extended from building-scale to campus-scale environments with technology from Virtools SA. At the same time, CADLAB is exploring a prototype "one click export to game" capability that uses CAD data in ifcXML format as the input, an IFC Model Server developed at VTT in Finland as middleware to a relational database (which we reported on in IssueTwelve), and the Quake II game as the output; a toy sketch of the export idea appears below.
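A word on what "directly translatable" means in the laser-scanning item above: turning millions of scanned points into model geometry usually starts with fitting primitive surfaces (planes for walls and slabs, cylinders for pipe runs) to subsets of the cloud. The sketch below shows the simplest version of that step, a RANSAC-style plane fit in plain Python. It illustrates the general idea only, not how Cyra or Quantapoint actually work, and the synthetic "scan" at the bottom is invented for the example.

```python
import random

def plane_from_points(p, q, r):
    """Return the plane (a, b, c, d) with ax + by + cz + d = 0 through three points, or None if degenerate."""
    ux, uy, uz = (q[i] - p[i] for i in range(3))
    vx, vy, vz = (r[i] - p[i] for i in range(3))
    a, b, c = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx  # cross product = normal
    norm = (a * a + b * b + c * c) ** 0.5
    if norm < 1e-12:
        return None  # the three samples were (nearly) collinear
    a, b, c = a / norm, b / norm, c / norm
    return a, b, c, -(a * p[0] + b * p[1] + c * p[2])

def ransac_plane(points, tolerance=0.02, iterations=500, seed=0):
    """Find the plane supported by the most points to within `tolerance` (same units as the scan)."""
    rng = random.Random(seed)
    best_plane, best_inliers = None, []
    for _ in range(iterations):
        plane = plane_from_points(*rng.sample(points, 3))
        if plane is None:
            continue
        a, b, c, d = plane
        inliers = [pt for pt in points
                   if abs(a * pt[0] + b * pt[1] + c * pt[2] + d) <= tolerance]
        if len(inliers) > len(best_inliers):
            best_plane, best_inliers = plane, inliers
    return best_plane, best_inliers

if __name__ == "__main__":
    # A synthetic "scan": 200 noisy points on a wall at x = 2.0 metres, plus 50 stray points.
    rng = random.Random(1)
    wall = [(2.0 + rng.gauss(0, 0.005), rng.uniform(0, 6), rng.uniform(0, 3)) for _ in range(200)]
    clutter = [(rng.uniform(0, 5), rng.uniform(0, 6), rng.uniform(0, 3)) for _ in range(50)]
    plane, inliers = ransac_plane(wall + clutter)
    print(plane, len(inliers))  # normal roughly (+/-1, 0, 0); most of the 200 wall points recovered
```

Real scan-to-model software layers much more on top of this (registering multiple scans, recognizing standard components, and so on), but surface fitting of this kind is the basic bridge from points to model geometry.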
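To make the pen-interface item a little more concrete, here is a minimal sketch of one common approach to gesture recognition: resample a pen stroke to a fixed number of points, normalize it for position and size, and match it against stored command templates. It is an illustration of the general technique only, not the DMG or any commercial implementation; the command names and templates are invented for the example, and refinements such as rotation invariance are omitted.

```python
import math

N_POINTS = 32  # every stroke is resampled to this many points before matching


def resample(stroke, n=N_POINTS):
    """Resample a pen stroke (a list of (x, y) points) to n points evenly spaced by arc length."""
    dists = [math.dist(a, b) for a, b in zip(stroke, stroke[1:])]
    total = sum(dists) or 1.0
    out, run, seg = [], 0.0, 0
    for t in (i * total / (n - 1) for i in range(n)):
        while seg < len(dists) - 1 and run + dists[seg] < t:
            run += dists[seg]
            seg += 1
        (ax, ay), (bx, by) = stroke[seg], stroke[seg + 1]
        f = 0.0 if dists[seg] == 0 else min(max((t - run) / dists[seg], 0.0), 1.0)
        out.append((ax + f * (bx - ax), ay + f * (by - ay)))
    return out


def normalize(points):
    """Translate the centroid to the origin and scale the larger bounding-box side to 1."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    w = max(x for x, _ in points) - min(x for x, _ in points)
    h = max(y for _, y in points) - min(y for _, y in points)
    s = max(w, h) or 1.0
    return [((x - cx) / s, (y - cy) / s) for x, y in points]


def classify(stroke, templates):
    """Return the command whose stored template lies closest to the stroke (mean point distance)."""
    probe = normalize(resample(stroke))

    def score(tpl):
        return sum(math.dist(p, q) for p, q in zip(probe, tpl)) / len(probe)

    return min(templates, key=lambda cmd: score(templates[cmd]))


# Hypothetical command templates: a horizontal slash standing in for "erase" and an
# L-shaped stroke standing in for "wall corner". A real system would record these
# from the designer rather than hard-coding them.
TEMPLATES = {
    "erase": normalize(resample([(0, 0), (10, 0)])),
    "wall-corner": normalize(resample([(0, 0), (0, 10), (10, 10)])),
}

if __name__ == "__main__":
    wobbly_stroke = [(1, 1), (1.2, 4), (0.8, 8), (1, 10), (5, 10.2), (9, 9.8)]
    print(classify(wobbly_stroke, TEMPLATES))  # prints "wall-corner"
```

Matching of this sort is cheap enough to run stroke by stroke on a TabletPC; the research systems named above go well beyond it, inferring 3D massing as well as commands.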
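The CADLAB "one click export to game" pipeline just described is, at heart, a translation problem: building-model data goes in, game-engine geometry comes out. The toy sketch below illustrates only that idea, reading a drastically simplified, ifcXML-like file and writing axis-aligned boxes to a made-up plain-text level format. The element and attribute names are invented placeholders rather than the real ifcXML schema, and the actual pipeline, as described above, runs through the VTT IFC Model Server and a proper Quake II map compiler instead.

```python
import xml.etree.ElementTree as ET

# Invented, minimal stand-in for ifcXML input. The real schema nests geometry far more
# deeply; here each element simply carries a precomputed bounding box in millimetres.
SAMPLE = """
<Building>
  <Wall id="w1" minX="0" minY="0" minZ="0" maxX="6000" maxY="200" maxZ="2700"/>
  <Wall id="w2" minX="0" minY="0" minZ="0" maxX="200" maxY="4000" maxZ="2700"/>
  <Slab id="s1" minX="0" minY="0" minZ="-300" maxX="6000" maxY="4000" maxZ="0"/>
</Building>
"""

def export_boxes(xml_text, scale=0.1):
    """Collect Wall/Slab bounding boxes and rescale them from millimetres to game units."""
    boxes = []
    for elem in ET.fromstring(xml_text):
        if elem.tag not in ("Wall", "Slab"):
            continue  # ignore anything we do not know how to turn into level geometry
        lo = tuple(float(elem.get("min" + axis)) * scale for axis in "XYZ")
        hi = tuple(float(elem.get("max" + axis)) * scale for axis in "XYZ")
        boxes.append((elem.get("id"), lo, hi))
    return boxes

def write_level(boxes, path="level.txt"):
    """Write one 'brush' line per box in a made-up plain-text level format."""
    with open(path, "w") as f:
        for name, lo, hi in boxes:
            f.write(f"brush {name} {lo[0]} {lo[1]} {lo[2]} {hi[0]} {hi[1]} {hi[2]}\n")

if __name__ == "__main__":
    write_level(export_boxes(SAMPLE))
```

In use, export_boxes does the parsing and write_level emits one box per building element; everything that makes the result walkable and multi-user is the game engine's job, which is precisely the leverage the "one click" approach is after.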

From laser-scanned 3D point clouds for model data input to CAD2Game interactive web output, there are more than enough new and emerging I/O capabilities to support the transition from the 2D desktop to 3D collaboration by AEC, FM, plant/process and infrastructure designers and their clients. With the tools in place or soon on their way, the question remains: are the building professions ready to put their input into these new modes of output?

Let me know what you think.


Editor and Publisher, The LaiserinLetter
Analysis, Strategy and Opinion for Technology Leaders in Design Business

NOTE: Special thanks to Chris Russon, Brian Woodward and Nick Ansley of Informatix Software International Limited for their help in arranging my visit to the Martin Centre CADLAB in Cambridge, and to Paul Lethbridge and Hajni Michael of Graphisoft UK for facilitating my participation in the Technological Innovation in Design and Construction Conference of the British Institute of Architectural Technologists, where I encountered the work of the Cooperative Systems Engineering Group and the VR Centre for the Built Environment.
JL


