This article discusses the benefits of digitising the planning system and the effect of digital twin technology on the built environment. It is based on our Built Environment Matters podcast featuring Jack Ricketts and Miranda Sharp.
While the fields of design and architecture have been pushing ahead with Modern Methods of Construction (MMC), BIM and 3D models, the planning process hasn’t actually changed much since the 1940s and remains largely paper-based. Thankfully, due to technological advancement, we now have the ability to modernise planning, and the emphasis on digitisation in the planning white paper provides high-level support. At Bryden Wood, our Creative Technologies team has been working alongside the London Borough of Southwark, the Centre for Digital Built Britain and 3D Repo to advance the issue.
The quest to digitise the planning system involves a complex ecosystem of different industry players coming together. Jack Ricketts, a planner at the London Borough of Southwark, doesn’t want to see the planning process holding others back. While working on projects funded by the Ministry of Housing, Communities and Local Government (MHCLG), Ricketts sought to avoid duplicating work, or becoming an accidental blocker to the process. He reached out to industry expert Miranda Sharp for help in making the shift towards digital data. Sharp’s company, Metis Digital, works with a range of companies trying to connect technical assets and data to value. During her time working with the Centre for Digital Built Britain (CDBB) on the National Digital Twin Programme (NDTp), Sharp sought secure and resilient ways of connecting digital twins to deliver the common good, and looked for real examples of people trying to connect data in order to tackle cross-silo issues. The goal was to facilitate more efficient planning and operation, and to make data available to a wider ecosystem, including all of the people involved in critical infrastructure planning.
Sharp says CDBB knew the desired activity was possible theoretically, but needed a place where there was real demand to bring the information together. When Jack Ricketts contacted her about his desire to digitise planning, it seemed a perfect opportunity.
There are various stages of information involved in the creation of a new building, or an extension of a domestic dwelling. Architects produce one set of information to one set of criteria, submitting it to planners who then need another, and particular, set of information. From there it goes to the people who might approve the building, or the people constructing the building. In reality, though, all of these people just need slightly different slices of the same information. If everyone could agree and collect information to the same standards, sharing the same pieces of information upon creation, and when changed, it would unblock the system and lead to significantly greater efficiency.
While early BIM slides showed a digital thread looping operational data all the way around, in reality there are all sorts of breakpoints. The handover from construction into operation never works particularly effectively, and we never really get that kind of handover into the capital model. Planning has always been one of the big digital breaks, where things suddenly go into quite a subjective, painful and paper-based process, and planning has long been held up as a blocker to housing, amongst other things. The impact of digitising the planning process would be enormous, causing many other aspects to fall into place.
Still, digitising planning presents a complex and difficult challenge. Ricketts says that while planning isn’t broken, it is slow. Once the designs of architects and engineers are submitted to the local planning authority, all of that design, modelling, information and data is, in a sense, dumbed down. It’s turned back into 2D plans and some PDF documents. Much of the valuable information is lost because councils can’t consume the 3D designs and BIM models created by architects and engineers.
From this point, the information goes to the local planning authority. While they’re good at interpreting it, things can take a long time depending on the size of the development. Ricketts says he’s spent time with case officers who’ve spent two days with a calculator trying to work out daylight and sunlight calculations and viability. He points out that these people didn’t go into planning to do those things. They went into planning to do the subjective work, and to do the planning.
While working on the Reducing Invalid Planning Applications project (RIPA), Ricketts began to map all of the legislation and planning policy, turning it into rules-based code. He wondered whether it could be used to map against BIM models, in order to extract all of the relevant information that planners need to assess and develop a decision. The aim would be to extract only the relevant pieces of information, out of the hundreds of thousands of bits of information in a BIM, leading to the question of how to present it for successful interpretation. This is where his second project, Back-office Planning Service (BoPS), comes in. BoPS would take the information and present it to the case officer. Ricketts says that although it all works in theory, in reality, we’re not even at the proof of concept stage yet. At the moment, there are conversations happening with partners and stakeholders about how best to achieve the goal.
The RIPA project started at the simple end of planning applications with permitted development, which might include something like an application for a slightly larger kitchen, or a loft extension. Permitted development is a technical assessment that provides yes or no answers based on aspects like allowable depths and heights, which makes codification easier. At the other end of the scale, work was also undertaken on a case study of an example development of 50 flats, mapping it against the common themes that run through planning applications. These included things such as names, addresses and locations, although specifics weren’t included as the work was still pre-proof-of-concept.
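To make the idea of rules-based codification concrete, here is a minimal sketch of how a yes/no permitted development check might look once policy is expressed as code. This is an illustration only: the thresholds and rule structure below are invented for the example and are not real planning policy or RIPA’s actual implementation.

```python
# Toy illustration of a rules-based permitted development check.
# The limits below are hypothetical, NOT real planning rules.
from dataclasses import dataclass

@dataclass
class RearExtension:
    depth_m: float        # how far the extension projects from the rear wall
    height_m: float       # maximum height of the extension
    detached_house: bool  # detached houses often get a deeper allowance

def permitted_development(ext: RearExtension) -> bool:
    """Return True only if every codified rule passes."""
    max_depth = 4.0 if ext.detached_house else 3.0  # hypothetical depth limits
    rules = [
        ext.depth_m <= max_depth,   # allowable depth
        ext.height_m <= 4.0,        # allowable height (hypothetical)
    ]
    return all(rules)

print(permitted_development(RearExtension(3.5, 3.8, detached_house=True)))   # True
print(permitted_development(RearExtension(3.5, 3.8, detached_house=False)))  # False
```

The same proposal passes for a detached house but fails otherwise, which is the point of codification: the answer follows mechanically from the rules, freeing case officers for the subjective work.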
Digitising planning is an emerging technology and way of working, and it’s hoped that learnings will be shared as things develop in different areas of the world. While we’ll never get rid of the subjective element of planning, and certainly not the need for people, digitising planning could help reduce the administrative burden, freeing up time for planners to do the more valuable, judgement-based planning work and enabling us to make better, and more holistic, sets of decisions.
Another benefit it’s hoped will arise from digitising the planning system is to make planning more accessible to members of the public. Being able to engage with and interpret a 3D model is much more helpful to people than being presented with unrealistic CGI imagery depicting permanent sunshine and few cars. Currently, trust in the planning system is at a low. If we can help people feel more in control of what they’re being consulted on, and give them a better sense of what a development will really look like, it should help to alleviate a lot of concern.
At present, ten Pathfinder projects are being undertaken by various local authorities. All of them are looking at how to digitise planning policy, and make it more machine readable. Local plans take years for councils to produce, and are based on evidence which is out of date almost as soon as it’s put into use, and definitely by the time the plan is published years later. Digitising the planning system will help us to start producing policies based on real time evidence. Policies could change over a very short time. Changes to market conditions, developer contributions, or the cost of land could impact the number, or location, of homes originally desired. Digitising planning would keep things much more reliable and up to date.
There would also be an opportunity to test more ideas before deciding which policy is the right one, and it’s hoped that by digitising the planning system we’ll also be able to do scenario testing. A brownfield site map could give a developer a much better sense of what they’re likely to get approved, enabling them to tweak their plans before they approach the council. This would thin out a lot of the existing legwork and administrative work, speeding everything up.
At Bryden Wood, our Creative Technologies team has found that in the process of developing configurators, simply putting the rigour behind what is or isn’t a rule, or what is or isn’t acceptable, is very helpful. Miranda Sharp says that she’s experienced exactly the same types of issues in the infrastructure space. She recalls a conversation with two linear infrastructure providers who wanted to establish a simple use case of information to be shared in order to better guard against, and mitigate, the effects of flood. However, with one provider interested in road safety and surface water, and the other in railways and embankment flooding, they quickly discovered that their differing concerns added significant complication to the task. A complex process of questioning ensued around the use and translation of nomenclature, which Sharp says was a useful process in itself. However, the fact remained that every time they opened a new conversation, a new dimension and further complexity was revealed.
Sharp says this is why the ideas behind the National Digital Twin are necessary: although we have lots of very specialised and efficient systems for sharing information, in order to achieve the next level of efficiency and public benefit, we need to start sharing information between systems. While we’re very optimised to keep water off the roads, we don’t really understand other aspects, such as whether a drainage ditch should be built to go left or right at a particular junction. The information exists; we just need better ways of translating between the systems that hold it.
Sharp says that digitising planning is very much a use case of the National Digital Twin programme because people can see the value in it and therefore have the appetite to address it. She highlights that we’re currently at the very early stages of connecting digital twins and we need to choose projects which will move us forward, picking the low hanging fruit. She cautions that we should be careful not to codify or entrench any particular positions that will be preventative in the future. We must look at where people are finding problems and identifying common difficulties, and then seek solutions for how we can address them and apply those lessons in other spheres. Planning is an area ripe with value and opportunity; geospatial policy is another. If someone is digging a hole with a pickaxe and finds a pipe, it creates risk for all sorts of domains. We need to share information more efficiently in order to keep people safer at work, cause less disruption to the local economy by digging up roads, and provide better utility services.
That said, when we talk about digital twins, we might not always be referring to the same thing. To some people the term ‘digital twin’ implies a highly advanced digital replication of an aircraft engine, while others imagine a real-time digital model of an entire city. But that latter level of sophistication isn’t required, or even necessarily advisable. A digital twin of a city would be too expensive to store and use, and it would also be immediately out of date. We need to keep things simple as we work out how to achieve our goals: a digital twin is simply a digital representation of a real thing.
Now, by real, we don’t necessarily mean physical. Sharp explains that the digital representation of a train timetable is just as important as the digital representation of a train itself, each providing important, but different, information. Further, when we use digital twin technology there should always be a two-way interaction between the digital twin and the real thing. The digital twin is used for simulation and experimentation, while the real-world thing provides feedback that can be used to affect the digital world. It’s possible that a digital twin could be real time, but it’s not a requirement. Sharp explains that thorough investigation of historic data is more important. In other words, do the lights always go down when it rains? Analysis of this type of pattern data is just as likely to help us achieve higher performing services as real time data.
As such, the National Digital Twin isn’t meant to be a one-to-one map of everything. The aim is to keep the data as close to the creator as possible, and to keep the creator in control. The National Digital Twin aims to create a translation mechanism called the Information Management Framework. This won’t be a massive data store sitting in the cloud. Instead, it’s a federated system allowing different parties to find and use data from other sources, and to check whether they’re allowed to do so.
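The federated idea described above can be sketched in a few lines: a shared catalogue records only where each dataset lives and who may use it, while the data itself stays with its creator. Everything here (names, roles, endpoints) is an illustrative assumption, not the real Information Management Framework.

```python
# Minimal sketch of a federated catalogue: metadata is shared centrally,
# data stays at the owner's endpoint. All entries are illustrative.
catalogue = [
    {"dataset": "flood-defence-assets", "owner": "water-utility",
     "endpoint": "https://example.org/floods", "allowed": {"planner", "highways"}},
    {"dataset": "road-drainage", "owner": "highways-agency",
     "endpoint": "https://example.org/drainage", "allowed": {"planner"}},
]

def find_datasets(topic: str, requester_role: str) -> list[str]:
    """Return endpoints the requester may use; the data never moves into the catalogue."""
    return [
        entry["endpoint"]
        for entry in catalogue
        if topic in entry["dataset"] and requester_role in entry["allowed"]
    ]

print(find_datasets("drainage", "planner"))  # ['https://example.org/drainage']
print(find_datasets("drainage", "public"))   # [] - permission check fails
```

The key design point is that discovery and permission are resolved centrally, while storage and control remain with each data creator, which is exactly the opposite of a massive cloud data store.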
However, certain problems remain. The first is that someone might not be sure whether the data they’re looking for even exists, and the second involves the requirements under which the data was gathered. It’s important to understand any inconsistencies there might be in the collection of data, because without understanding those types of quirks and failings, it’s hard to know how useful the data might be. Additionally, the further the data gets from its intended use, the less useful it is. This is why digitising planning makes quite a good use case because, in theory, it’s a linear process.
The next issue with data is governance, and the question of whether someone is allowed to share particular data. Here we see issues like cybersecurity and GDPR coming into play, presenting a list of complex and expensive regulations to navigate. Finally, there’s the issue of transacting data. While on the one hand someone might worry about opening themselves up to liability by providing data, they might also be wary of someone else using their data for financial gain, and of missing out on that opportunity themselves. Sharp says the situation has led to a nervousness and immaturity surrounding data, resulting in a tendency either to refuse to share data, or a lack of control for the end user. What we need, she says, is a much more mature set of transaction mechanisms. We need agreements, possibly establishing the rules of data use, or how any profits will be shared. This will be key to getting a really high-functioning National Digital Twin, even with the National Data Strategy in place.
This is uncharted, complex territory. Security is a key issue, and security services are embedded within the programme. In fact, Sharp says security is the area she feels most confident about now, but she also points out that there have been some learnings to this end along the way. She cautions that we need to be careful about creating risk, which can happen even with good intentions. Ultimately though, we’re making quick progress with safety and security because it sets a whole lot of other things to zero. In the case of the Underground Asset Register, and to some extent the Construction Data Trust, there are certain types of data people are very willing to share, because, for example, everyone wants to help prevent deaths on construction sites. In these types of examples, governance and transactional issues are less of an issue. However, when trying to make the planning process flow better, or trying to design an integrated transport system, it’s harder to be certain about what gains will be achieved from connecting digital assets. Sharp says that type of understanding is still a long way off.
Jack Ricketts says it isn’t yet known how long digitising the planning process will take, not least because of the financial investment required. However, there are plans for an application to Innovate UK and its Smart Grants funding programme. Of course, digitising the entire planning process is an enormous goal, and for efficiency’s sake the process will have to begin with a single use case. Ricketts feels the best and most informative starting point is building safety, highlighting MHCLG’s external wall system survey as a good example of data collection and collation.
The MHCLG project asked local authorities to research and report back on the materials present in the external walls of all buildings above 18 metres within a particular borough. Although it sounds like a straightforward task, in reality it presented some unexpectedly complex questions and issues to surmount. What is a building? Is it the postal address? The UPRN (Unique Property Reference Number)? The single address with three blocks in it?
Next came the question of how best to approach obtaining the information. Ricketts says local authorities were using a combination of approaches, from trawling through records (building control, planning and housing), to looking at GIS mapping, and even cycling the borough. In the end, his team used 3D modelling to help them identify the buildings above 18 metres, but many of them turned out to be private buildings, making it necessary to link with Land Registry and Companies House in order to find and contact the owners to obtain the data. He adds that it’s difficult to make sense of so much information in a single Excel spreadsheet, and says it ended up multiplying exponentially.
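The survey step Ricketts describes reduces to a simple filter-and-flag operation once the building data is digital. The sketch below illustrates that shape; the field names and records are invented for the example, and a real dataset would of course come from the council’s 3D model and property records rather than a hard-coded list.

```python
# Illustrative sketch of the 18-metre survey: filter buildings by height,
# then flag privately owned ones needing a Land Registry / Companies House
# ownership lookup. Records and field names are hypothetical.
buildings = [
    {"uprn": "100001", "height_m": 24.5, "council_owned": False},
    {"uprn": "100002", "height_m": 12.0, "council_owned": True},
    {"uprn": "100003", "height_m": 19.2, "council_owned": True},
]

# Buildings in scope for the external wall survey
tall = [b for b in buildings if b["height_m"] > 18]

# Private buildings where the owner must be traced before data can be requested
needs_ownership_lookup = [b["uprn"] for b in tall if not b["council_owned"]]

print(len(tall))               # 2 buildings above 18 m
print(needs_ownership_lookup)  # ['100001']
```

Even this trivial version shows why a shared identifier like the UPRN matters: without a stable key, the results of the height filter cannot be joined to ownership records at all.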
Although Ricketts is keen to begin to harness the benefits and opportunities that will come from digitising the planning system, he agrees with Sharp’s observations about security and the possibility of accidentally sharing too much. He believes it will be much better to build in the necessary data security right from the beginning of the process, and says that will be a key next step to address. This will mean that applicants will share with the local planning authority only as much as is actually needed. Unlike the 2D plans and PDF documents currently in use, BIM and digital technology enable applicants to segment their designs, making this possible and less prone to mistakes.
Ricketts says that next steps will revolve around using the Gateway One process that’s come out of the Hackitt Report and the Building Safety Bill, and will involve extracting and digitising important fire safety information from architecture and engineering models. The Building Safety Bill specifies that information must be held securely on a digital platform, and it’s much more sensible to start doing it now, rather than having to catch up later once it’s all gone through the paper-based planning system. The importance, value and scalability of digitising these building safety processes will be evident to everyone.
The possibility of digitising our built environment clearly holds the potential for a wide array of valuable benefits. Miranda Sharp has been working on a demonstrator project focused on information sharing between utility providers in order to facilitate better decision making. She says that sharing data in this way could have a significant impact on our progress towards net zero. Another example is Gavin Starks’ Icebreaker One project, which seeks to improve efficiency in data sharing. One challenge surrounding the issue is that bilateral sharing arrangements can actually create a significant amount of difficulty for other people. Sharp says that what we really need is something more free-flowing, and adds that we should be nudging the market in the right direction as it develops so that we end up with an optimally connected set of systems.
However, we’re also going to need to definitively demonstrate that the sharing of data will lead to capturable benefits, because such benefits are distributed, and until we demonstrate that, people won’t be motivated to take things forward. We want the people who invest to continue to invest. Sharp points out that we use a lot of energy to make water, and a lot of water to make energy. We need to understand which is the most economically efficient, and which is the best for the environment. Having clear, data-based answers will help with regulatory and policy decisions. The more information we have, the better outcomes we’ll be able to achieve.
Another potential benefit of digitising the built environment will be a positive impact on the use of modern methods of construction. Above all else, developers are looking for certainty and speed. Sharing data across the design, engineering, planning and construction elements of development will enable us to push and promote modern methods of construction with all of the associated sustainability and energy benefits. When design and planning happen more quickly, people will want to see construction happen faster too. MMC won’t need to be enforced; developers will come on board willingly because MMC will provide much quicker outcomes, with much greater levels of certainty. Ultimately, when we make MMC and design for precision manufacturing the default option, we’ll be able to make better homes for people.
At Bryden Wood we’re currently working on the New Hospitals Programme, which is the first real enactment of the Construction Playbook. Within the lifetime of that programme, we hope to have fundamentally changed the way physical building is done, the way we use MMC, and the way we deliver assets. Such a success would pave the way for P-DfMA and other MMC methodologies to be rolled out across other social infrastructure, including schools, social housing and more. We envision that there is likely to be about a ten-year window of opportunity here, and the industry needs to make a start. It also seems likely that within the grand scheme of things on the horizon, the digital work will take longer than the physical aspects, and we should be conscious of that.
Miranda Sharp reminds us that digitising planning and the wider built environment won’t be easy to do. Some of the necessary work will be boring, and it will be a grind to make the data interoperable and set up the transaction mechanisms. Aligning standards, cleaning up data and creating transformation will all take a long time. The industry wants a silver bullet; it wants to skip to the end, but there’s some fairly heavy lifting to be done first. Sharp says we need to get better at taking the quick wins. The sooner we start, the sooner we’ll reach our goal. The rewards will be worth it.
To learn more about our Design to Value approach to design and construction, sign up for our monthly newsletter here: http://bit.ly/BWNewsUpdates