Some people might tell you that the great grim motto of the STEM (Science, Technology, Engineering, Math) field is: sweat, tears, and pointless manual labor. So here is just a small update on how things are for the Autodesk Robot Structural Analysis (ARSA) users out there.
Last week brought a few new discoveries of problems that a Robot user should be aware of. I'll just leave them here for future reference:
- While ARSA is able to export 3D faces from DWG/DXF files, the process is prohibitively slow, taking about two minutes per single 3D face. This adds to the pile of problems with structural calculation compatibility, which can be summarized as follows: if you ever switch to another structural analysis package (or sometimes even another preprocessing technique), you will have to rebuild your existing models nearly from scratch in order to use them.
- The issue with response spectrum analysis (namely, incorrect ZPA calculation) that has been plaguing dynamic analysis in ARSA at least since release 2011 (and possibly earlier) is still there after a major version update and three service packs. Seriously, how do you get away with this crap? Am I the first person to ever use ARSA for dynamics? I don't think so, since there are people far more proficient with dynamics in ARSA than me.
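For readers who haven't run into this one: the ZPA (zero-period acceleration) is simply the spectral ordinate at vanishing period, which equals the peak ground acceleration, and it feeds missing-mass and rigid-response corrections downstream. A minimal sketch of reading it off a tabulated spectrum (purely illustrative, nothing to do with how ARSA implements it):

```python
# Illustrative only: the ZPA is the spectral acceleration at vanishing
# period. A spectrum tabulated as (period, Sa) pairs should converge
# to the ZPA (= peak ground acceleration) as the period approaches zero.

def zpa(spectrum):
    """Return the spectral ordinate at the shortest tabulated period.

    spectrum: iterable of (period_s, accel_g) tuples, in any order.
    """
    period, accel = min(spectrum, key=lambda point: point[0])
    return accel

# A toy design spectrum: plateau, then roll-off toward the ZPA at T -> 0.
design_spectrum = [
    (0.01, 0.35),  # near-zero period: ordinate ~ ZPA (= PGA)
    (0.10, 0.80),
    (0.50, 0.80),
    (1.00, 0.45),
    (2.00, 0.25),
]
print(zpa(design_spectrum))  # -> 0.35
```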
These issues were discussed with the ARSA support team (these guys do a really nice job, considering the neglect they receive from our mighty ADSK overlords), and it was established that they are persistent and endemic to the ARSA application.
This September, Autodesk (namely its experimental development division, Autodesk Labs) presented a new cloud-based tool for the structural analysis discipline called Project Storm.
Project Storm is described as a web-based service that plugs into Revit Structure, Autodesk's flagship structural modeling package, and sends the generated model for analysis on a remote server. The results can then be viewed in a web report or imported directly back into the Revit model.
This is, for now, as close as we can get to cloud computing in structural engineering. The materials available online on Project Storm are limited to a video describing the tool's features. It couldn't go without some BIM hype about "informed decision making earlier in the process". The "decision making" that FE analysis actually facilitates is plain and simple: determining the structure's stressed state in order to run code checks and other postprocessing.
In the video, Tomasz Fudala, a member of Autodesk's Polish team (the one that supports Autodesk's structural analysis product, Robot), walks through the features of Project Storm, complete with the now seemingly mandatory hype lexicon on BIM workflow integration and "informed decision-making at early design stages".
I am not aware of any "informed decision-making" that has anything to do with analyzing a design model, though. Structural design does.
A whole pack of Autodesk-affiliated or otherwise enthusiastic bloggers have now joined in on the Project Storm news, most of them working in BIM modeling. Not a very far cry from structural analysis (the discipline Project Storm is meant to serve), but still.
Reviteer: Project Storm for Revit Structure
Let me take my own humble look at Project Storm.
1. Installation.

Since the product is, for reasons unknown, available only in selected countries, I had to use some workarounds to get the installer. Once obtained, the Project Storm setup unpacks smoothly from a single executable archive, pretty much like any other Autodesk software.
2. Look and feel.
Storm installs into the Revit Structure environment as an additional button in the Structural Analysis tab.
Storm middleware also hangs out in the system tray, entertaining the user with utterly meaningless popups.
Since no traces of Storm's installed files were found on the hard drive, I got curious about its actual contents. Unpacking the MSI install file revealed an internal package structure like this:
All the interesting stuff is in the dynamic libraries folder.
What do we see here? Lots of ready-to-go modular interfaces that include Amazon (Cloud?) stuff, archiving and logging libraries and the Project Storm’s core: the Stratus cloud engine developed by Stepheson&Turner for Autodesk.
The Stratus engine interface is essentially the main part of Project Storm. But how is it actually used, given that Storm is supposed to do structural analysis over a cloud service, and what is the back-end analysis engine? This is where things get funny and sad at the same time.
3. Storm’s Internal Mechanics
I hit the Analysis button and this laconic form ensues. The options for analysis settings are, to put it shortly, appalling. The only interesting thing here is the mention of Robot Structural Analysis. This is basically the only time the Robot software is mentioned in any of the Project Storm promotional materials.
Every numerical structural analysis requires a tested and reliable solver as the engine between pre-processing (putting together the structural model) and post-processing (deriving meaningful results from the stress-strain field and evaluating performance criteria). Good tools of this type range from the humble works of single individuals to proprietary monsters like NASTRAN or ANSYS.
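At its core, any such solver just assembles and solves equilibrium equations. A deliberately tiny sketch of that middle stage (two axial springs, a hand-solved 2x2 system; all names here are mine, not any vendor's):

```python
# A minimal sketch of what a structural solver does between pre- and
# post-processing: assemble a stiffness matrix, apply supports, and
# solve K u = F. Two axial springs in series (stiffnesses k1, k2),
# node 0 fixed, force P pulling on node 2.

def solve_two_springs(k1, k2, P):
    # Reduced system after eliminating the fixed DOF at node 0:
    #   [k1+k2  -k2] [u1]   [0]
    #   [ -k2    k2] [u2] = [P]
    a, b, c, d = k1 + k2, -k2, -k2, k2
    det = a * d - b * c
    u1 = (-b * P) / det          # Cramer's rule, F = (0, P)
    u2 = (a * P) / det
    return u1, u2

u1, u2 = solve_two_springs(k1=100.0, k2=50.0, P=10.0)
print(u1, u2)  # springs in series: u1 = P/k1 = 0.1, u2 = P/k1 + P/k2 = 0.3
```

The post-processing step would then recover member forces from these displacements, which is where code checks actually start.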
Project Storm has nothing to do with any of these. As the analysis form shyly admits, the real software behind the Project Storm interface is another Autodesk product, Robot Structural Analysis (aka ARSA). Anyone with rudimentary ARSA experience will immediately recognize it behind this example Storm output, formatted in HTML:
Pictured: the results report that Project Storm produces as it concludes the analysis. The image and analysis data are essentially a typical ARSA results report, reformatted.
This actually concludes my analysis of Project Storm's capabilities. Project Storm is nothing more than a web envelope for our good old ARSA package. It is basically the same "Robot link" that reviteers have already had for quite a long time, but without the need to have a Robot Structural Analysis installation handy, and without the need to know anything about real structural analysis and its many peculiarities whatsoever. All it does is use a ready-made cloud interface to upload the structural model to an application server running an instance of ARSA under default settings. ARSA does all the real work and bounces the results back to be presented as an HTML report or plotted over the Revit model. No need to mention the actual engine in any of the news. Neat and tidy.
Of course, it was fun to explore the largest AEC software vendor's cloud plans using the example of Project Storm. But the software's practical usefulness is vanishingly small, to the point of no use at all. You may surely forfeit all hope of doing anything with it that would be even remotely relevant to all the "cloud analysis" hype in the videos, intros and announcements.
I was unable to make any use of Storm with the sample models that come packed with Revit Structure and Robot Structural Analysis. To feed these default, Autodesk-made models to Storm, some really disruptive editing had to be done, involving the deletion of whole parts of the model and rendering it practically useless, able only to demonstrate how the process is meant to work.
Analysis speed, to my surprise, doesn't look good compared to the desktop either. Storm's cloud analysis is extremely slow, likely because the server yields only a tiny fraction of its resources to your particular task. And with Storm, we don't get to run any dynamics tasks, where all the really big resource spending is.
In other words, the cloud speed and resource claims in the case of Project Storm are no more than a standard cloud computing mantra.
(Pictured: cloud calculations took around four minutes for this simple model, compared to a fraction of a second on the desktop.)
(Pictured: with Project Storm and cloud analysis, you won't get anywhere close to the immense amounts of information that are endemic to structural analysis.)
To put it short: Project Storm is, as it turns out, a technology demo only. It would be unwise to expect it to be able to do any real work.
And the Autodesk Labs site should have had the guts to admit that this is just a web port of the Revit-to-Robot link, one that does not allow any preprocessing or analysis parameter settings, not to mention the "increased productivity" hype.
On the other hand, the good thing is that this tool allows Revit users without any analysis experience to exploit the immense capabilities of an FEA solver engine. Oh wait… is it really a good thing to hand an immensely complex tool, one meant to resolve strength, stability and structural performance issues that are often matters of life and death, to untrained folks and make it seem like something easy? Perhaps I'd use the word "profanation" instead. Or "sacrilege", depending on how hell-bent you are on structural engineering.
What about further development prospects? Can we make any speculations about those?
Some guys from Poland (where Autodesk Robot Structural Analysis is developed and has been very popular) are celebrating ARSA's role in this Project Storm affair, using words like "how the tables have turned; it's clear who is now the leader in Autodesk AEC". I think they are experiencing the onset of a dangerous furor. As you can see, the "Analysis" button in Revit is now reserved not for ARSA, but for the cloud:
Autodesk's cloud initiative is not promoting ARSA in any way. It is the other way round: using the cloud layer as a virtualization tool, they will try to incorporate Robot's functions into Revit. With Autodesk's apparently increased cloud financing, cloud computing will use ARSA's capabilities without any need to promote the ARSA brand.
In this scenario, the standalone Autodesk Robot Structural Analysis package has every chance of being expelled from users' desktops, simultaneously opening the way for unqualified Revit modelers and designers to use its capabilities without appropriate expertise.
The consequences for the structural analysis discipline are hardly positive.
In my search for ideas on how to present engineering portfolios, I frequently come upon random sites that claim they can do your CAD work. Most of them come from India, obviously, proving once again that our very own Eastern Europe is basically a black hole on the globalized world's economic maps.
Now the funny thing is that these sweatshops pretend to be able to do 3D drafting work, samples of which are presented in their portfolios. Like these two.
Yep, portfolios. Two perfectly identical sets of images on the sites of seemingly unrelated companies. Do they really think their clients are morons with a five-second visual memory span, unable to do basic searching, filtering and comparison?
I think these "companies" are tarred with the same brush as any other type of web fraud. I'd rather leave this kind of work to real in-house engineers and designers, and leave the outsourcers to writing their lousy .NET code.
Sorry I didn't post about all the great stuff that happened; I was too consumed by Twitter anyway. The plan is to make a directory of topics here and post some earlier English articles. Also, I've met a lot of extraordinary gentlemen on Twitter and around the internet, mostly related to FEA. I will post about them today.
Of course, there's been a lot of bear wrestling, dog hunting and other Russian-language activities, but I am pleased to have at least one English publication. That's a guest editorial at Ralph Grabowski's upFront.eZine, a newsletter that illuminates the whole CAD industry. I didn't know it was pending publication, so I was really harsh in my definitions there. I don't regret it at all.
And my EERI membership certificate came today. EERI is one of the most prominent seismic engineering societies out there. Okay, it's just an e-membership, but precious to have anyway. Feels like being a real Earthquake Engineer!
And the most intriguing part: thanks to the collaboration with Ralph Grabowski, a brand-new, two-worlds-collide, full-length feature article has been published on Isicad.ru! For those of you who read Russian, it should be an interesting read about BIM, its advancements and grievances.
For everyone who happens to read me: I'm back and rollin'. Lots of things happened over the course of this half-year, and a lot more is planned. I'll begin repenting for the prolonged silence by throwing in my older article on BIM published at isicad.net, the English version of a very popular CAD resource supported by the famous LEDAS company.
A More or Less Optimistic Update on BIM
The current state of thought in the AEC industry views Building Information Modeling (BIM) as pretty much an established mainstream technology. Nevertheless, BIM has seen little implementation in the CIS market to date. Regionally savvy CAD experts, academics and pundits have been heavily enthusiastic about BIM in recent years, but its route to the average workplace has been hindered for various reasons.
With this in mind, it is small wonder that the COFES-Russia seminar recently held in Moscow, and its BIM workgroup, reignited the discussion over BIM, its very definition and its prospects. After the COFES seminar had done its job of introducing the participants' initial positions to each other, multiple publications emerged at isicad.ru, followed by mostly very insightful comments. (See more details about the authors in David Levin's blog post, where he also provides an even better introduction.)
In this summary, I will try to cover the most clearly articulated viewpoints in this highly enjoyable discussion, inevitably plunging into a somewhat biased depiction of alleged BIM problems and possible alternative solutions.
A Doubt to Begin with
The above-mentioned discussions can be summarized as an interchange of insight among professionals from very different backgrounds, and the resulting formation of two contradicting viewpoints: one being that BIM in its current incarnation is the de facto choice for everyone striving to be relevant in the future, and the other questioning the postulated BIM benefits and pointing out systemic weaknesses, such as disregard for prior practice and problems with existing projects.
The discussion evolved beyond a trivial exchange of "yes it can – no it can't" as Alexander Yampolsky's clearly skeptical feature argued that replicating reality with BIM is by no means feasible for the majority of projects, and that while throwing BIM techniques into the process may improve design quality, the single fully integrated model is unachievable.
In a spectacular backlash, Vladimir Talapov delivered a series of articles (1, 2, 3, 4) that comprised a true manifesto of the BIM enthusiast, elaborating the history of the catchword and citing the works of Frank Gehry as state-of-the-art examples of BIM success. Most of the praise given to BIM is justified and in line with well-known arguments like project coherence, bigger savings and the need to catch up with the growing needs of owners.
Yet, in its eulogy to the BIM paradigm, the series threw quite a few easily attackable statements into the discussion. It postulated that the early concepts of the 70s and today's vision of BIM are essentially the same thing, and included in the definition of BIM any technology that makes use of parametrics, information linking or the attachment of additional data to a model of any dimension. When it went on to label BIM skeptics as Luddites, the battle lines were drawn, and the next feature, authored by yours truly, was predictably named "A Less Optimistic Viewpoint at BIM". In it, an effort was made to push back against the enthusiastic pressure by revisiting the definition of BIM and describing reservations about industry-wide BIM embracement, as seen from the trenches, to the CAD cognoscenti.
A Less Optimistic Definition of BIM
A short analysis of (acceptably) recent publications shows that even among the English-speaking, early-adopter majority there is still much ambiguity about the meaning of BIM (for example, CAD management guru Robert Green views BIM as a superstructure over 3D design while showing awareness of this ambiguity), and that’s only more true for the AEC community in Eastern Europe.
Over the course of the current discussion, BIM proponents have characterized the approach as fully integrated, internally linked, intelligently tied to geometry, and allowing for painless editing, revising, interdisciplinary liaison, digital processing and manufacturing: basically all possible virtues of an integrated design process. There is, however, a much more specific definition of present-day BIM that follows naturally from market requirements and the available functions of the software packages which employers and users recognize as BIM flagships.
In a nutshell, there are three things that are required from a workflow to qualify for BIM:
- a geometric model that allows for advanced shape and space control and collision detection, which almost inevitably limits us to 3D implementations;
- parametric objects with intelligent links and constraints between them;
- the possibility to attach any relevant information to the geometric model for automated processing.
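To make the second pillar concrete, here is a toy sketch (my own invention, not any vendor's object model) of a parametric object with a derived property and a simple constraint between two objects:

```python
# A toy illustration of parametric objects: derived data stays
# consistent with the driving parameters, and a declared constraint
# ties one object to another. Class and parameter names are made up.

class Wall:
    def __init__(self, length, height):
        self.length = length
        self.height = height

    @property
    def area(self):
        # Derived, never stored: always consistent with the parameters.
        return self.length * self.height

class Window:
    """Constrained to sit fully inside its host wall (checked at creation)."""
    def __init__(self, host, offset, width):
        assert offset + width <= host.length, "constraint: window inside wall"
        self.host, self.offset, self.width = host, offset, width

wall = Wall(length=6.0, height=3.0)
win = Window(wall, offset=1.0, width=1.2)
wall.length = 8.0   # edit the parameter...
print(wall.area)    # ...and the derived area follows: 24.0
```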
The article asserts that all the benefits of BIM derive from these three pillars. What does not derive from them is the possibility of turning BIM into an AEC-tailored Product Lifecycle Management (PLM) solution. BIM is essentially an equivalent of PDM, not PLM, and the newly invented acronym BLM (Building Lifecycle Management) is unlikely to change this.
The reason for this is the current inability of BIM to handle the enormous amount of highly formalized and regulated maintenance documentation usually associated with complex objects like, say, industrial structures. BIM requires the paperwork and existing processes to align to its needs, not the other way around, and the building industry will reject a change as big and as unavailing as the one BIM would bring to maintenance workflows. In other words, the more heterogeneous the information to be incorporated into BIM on a given project, the more likely BIM is to go out the window on that project.
Moreover, while the undeniable benefits of BIM were discussed in detail in Vladimir Talapov's series and the discussion that followed, that discussion also brought to the attention of the CIS AEC community some less obvious fallacies of BIM implementation (some of them naturally in line with the general BIM criticism one encounters in many sources): neglect of engineers' needs, loss of established workflows, lock-in to a specific solution provider, trading away versatility for niche efficiency, and the much larger effort required, since more information means more detailing and additional work to interlink it. (This, apparently, is what drives the efforts to make BIM serve the object throughout its service life: it increases the overall output per spent effort.)
This was, of course, not meant to question the already proven worthiness of BIM, but it proved more than enough to dilute the praise and get more realistic about BIM prospects in the CIS region.
Update on BIM: the Complexity Challenge
Before describing the final conclusion given in the article, there is a really interesting issue to take into account. In a double shot, Vladimir Talapov and Vladimir Malukh of LEDAS delivered two independent yet strongly connected opinions about what mechanical CAD applications can do to the AEC industry.
Talapov's article put a spotlight on the history of the highly successful BIM pioneers Gehry Technologies and their Digital Project software (in essence a heavily modified version of the MCAD juggernaut CATIA), arguing that BIM would be the answer to the challenges posed by complex architectural shapes similar to those used in the distinctive Gehry-tecture style.
Malukh, in his viewpoint, went on to describe how MCAD software makes its way into AEC and where it can be seen as a natural replacement for traditional solutions. He provided usage cases for each of the four major MCAD modeling engines and set up a list of reasons for MCAD to go into AEC, those reasons being:
- more advanced complex modeling tools when compared to niche products;
- parametric technology being a standard rather than a novelty;
- rich tools for design variations and version control;
- native support for MEP needs;
- seamless integration with other machinery and equipment modeling;
- seamless (well, I’d rephrase this as less painful) transfer to structural analysis and back to the model;
- ready-to-go PDM/PLM infrastructure;
- prospects for construction technology to borrow more from MCAD manufacturing.
Summing up these two accounts of MCAD with regard to BIM and AEC, there is one clear pattern to observe. There are buildings and structures that fit well into this MCAD paradigm and, as Vladimir Talapov noted, may explicitly require 3D modeling tools and paperless, machine-driven manufacturing in order to be buildable at all. These are the projects possessing complex shapes and machine-like connections between elements.
In such cases, highly automated, MCAD-level craftsmanship is required, and it is what makes these projects a better fit for PDM (and, by extension, MCAD tools) implementation. The point is, it's actually a machine-like building to begin with. But what is its connection to the BIM technology in question?
Where do we go with BIM
The described advent of MCAD into AEC essentially means rebuilding BIM technology on an existing PDM and modeling platform. Yet the fact that BIM goes along well with MCAD-specific PDM tools does not mean that one merely has to disguise PDM with architectural and structural engineering tools in order to repeat its success.
If we look at current BIM not as modeling itself (that would basically be a talk about 3D and parametrics) but as a process, we see that, in practice, it effectively boils down to an out-of-the-box solution implemented as a specific vendor sees fit, driven by its view of the future rather than by users' needs. This means that BIM technology has to evolve into, or at least offer as an alternative, something more easily tailored to the existing needs and workflows of its users.
This is all too well illustrated by the example of Gehry Technologies, who actually had to develop their own BIM system on their way to success. Of course, few are able to climb over to the opposite side of the CAD retail counter, and unique projects are, well, exactly what "unique" implies: rare. To proliferate from the seeds of state-of-the-art skyscrapers and technological wonders down to the average user, BIM must get smaller and more modular. Here are some speculations provided in the initial article and augmented by the more recent opinions.
Not fitting everything into the BIM workflow. Instead, fit BIM technology into a customized workflow. The single, integrated, all-embracing building data model is a myth, and software vendors will have to deal with that sooner or later.
Providing a new level of data exchange. Communication is the key and, as far as CAD tools are concerned, communication means automation. Lower-level automation still isn't a resolved issue. Automatic links from Excel and MathCAD worksheets, Matlab calculations and FEA design models back into the BIM software could unite existing, established workflows with however advanced BIM deployments. (Not to mention that engineers are notoriously lazy and would be happy to pretend they are not evolving while in fact catching up with progress anyway.)
Going modular. There is a long tradition of engineers developing software tools for their own needs. To revive this practice, easily embeddable parametric or constraint engines, analytical geometry tools, and 3D modeling tools able to feed the classic engineering analysis software would bring BIM to the majority of desktops sooner than one would think.
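The "new level of data exchange" point above can be sketched in a few lines: pull values from an engineer's worksheet export and push them into a model's parameters. Everything here (the field names, the update_parameter helper, CSV as the transport) is hypothetical, just to show the shape of such a link:

```python
# Hedged sketch of a worksheet-to-model link. A real deployment would
# talk to a live BIM session; here a dict stands in for the model, and
# a CSV string stands in for an Excel/MathCAD worksheet export.
import csv
import io

worksheet = io.StringIO(
    "element,parameter,value\n"
    "B-101,depth_mm,450\n"
    "B-101,width_mm,300\n"
)

model = {}  # stand-in for a live model session

def update_parameter(model, element, parameter, value):
    """Hypothetical write-back call: set one parameter on one element."""
    model.setdefault(element, {})[parameter] = float(value)

for row in csv.DictReader(worksheet):
    update_parameter(model, row["element"], row["parameter"], row["value"])

print(model)  # -> {'B-101': {'depth_mm': 450.0, 'width_mm': 300.0}}
```

The point of the exercise: the engineer keeps working in the worksheet, and the model catches up automatically instead of by manual re-entry.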
Here is a set of images giving a very rough idea of how a full-blown transient time-history (TH) analysis is done in Autodesk Robot using spectrum-compatible accelerograms: http://www.ipernity.com/doc/197714/album/237239/show
The comments are in Russian; sorry about that.
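For context, "spectrum-compatible" means the record's own response spectrum (the peak response of a damped SDOF oscillator, swept over a range of periods) matches a target design spectrum. A rough, self-contained sketch of computing one spectral ordinate with the explicit central-difference scheme (toy record, arbitrary units, nothing to do with Robot's actual solver):

```python
# Illustrative only: one ordinate of a response spectrum, computed by
# time-stepping a unit-mass damped SDOF oscillator under a toy record.
import math

def spectral_accel(accel, dt, period, damping=0.05):
    """Peak pseudo-acceleration of a damped SDOF oscillator, integrated
    with the central-difference scheme (stable: dt is well below T/pi)."""
    w = 2.0 * math.pi / period          # natural circular frequency
    c, k = 2.0 * damping * w, w * w     # damping and stiffness (unit mass)
    a0 = 1.0 / dt**2 + c / (2.0 * dt)
    u_prev = u = peak = 0.0
    for ag in accel:
        # u'' + c*u' + k*u = -ag, central differences for u'' and u'
        rhs = -ag - k * u + (2.0 * u - u_prev) / dt**2 + c * u_prev / (2.0 * dt)
        u_next = rhs / a0
        peak = max(peak, abs(u_next))
        u_prev, u = u, u_next
    return k * peak                     # Sa = w^2 * Sd (pseudo-acceleration)

# Toy "record": one second of a 0.3-amplitude sine at a 0.4 s period.
dt = 0.01
record = [0.3 * math.sin(2.0 * math.pi * (i * dt) / 0.4) for i in range(100)]
sa_resonant = spectral_accel(record, dt, period=0.4)  # near resonance
sa_long = spectral_accel(record, dt, period=3.0)      # far from resonance
print(sa_resonant > sa_long)  # resonance amplifies the response: True
```

Repeating this over a grid of periods gives the record's spectrum, which is what gets compared (and iteratively matched) against the target.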
If anyone is actually reading this, please surf over to the Autodesk Imperial Headquarters and vote for the proposed Robot class for Autodesk University 2011. The people should build their brave new world themselves, comrades!
I will look into the possibility of, I dunno, becoming a speaker there. Why not?