Building Information Modelling (BIM) for content

Building Information Modelling (BIM) is an increasingly popular technique used in the construction industry. It involves creating digital models of buildings and tunnels during each stage of a project. However, these are more than just 3D models, as they also embed information about the physical objects in the building. According to Wikipedia:

“A building owner may find evidence of a leak in his building. Rather than exploring the physical building, he may turn to the model and see that a water valve is located in the suspect location. He could also have in the model the specific valve size, manufacturer, part number, and any other information ever researched in the past, pending adequate computing power.”

It means architects and engineers can “see” behind walls and discover if there are any pipes or cables that might be affected by any planned works.

This concept of an intelligent model that can be shared between stakeholders throughout the whole lifecycle is also the future for content. Organisations want to know how different items of content are related, what structural and metadata information sits behind the presentation layer, and how content has developed over time. They want the ability to use a model to plan and modify before they start the more costly work of implementation.

BIM could perhaps provide a useful analogy for Technical Authors, procedures writers, and others developing text-based content, when they are explaining the purpose and value of structured content, single sourcing and Component Content Management Systems.

Teachers need content management systems, too

The Guardian has an article today called “Teachers and parents criticise ‘robotic’ software-generated school reports”. It explains that teachers are finding report-writing software isn’t meeting their needs:

“It often frustrated him, as none of the options would quite capture what he wanted to say about a child, and the end product was never satisfactory.”

It states that, as an alternative, some teachers have a comment bank, which they use to cut and paste into school reports. One teacher said:

“I’ve got a bank of literary comments, maths comments and general comments. You can pick one that sounds about right, whip it out and plonk it in.”

A better solution might be a content management system that could contain a single-sourced comment bank, templates and some advice on what to write where.
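To make the idea concrete, here is a minimal sketch of how a single-sourced comment bank with templates might work. The categories, comment text and function name are made up for illustration; a real system would store the bank centrally and let teachers review and personalise the assembled draft.

```python
# A minimal sketch of a single-sourced comment bank, with hypothetical
# categories and comments. A real CMS would hold these centrally so every
# teacher draws from the same, maintained source.

COMMENT_BANK = {
    "literacy": "{name} reads with growing confidence and enjoys class discussion.",
    "maths": "{name} is secure with the core number work covered this term.",
    "general": "{name} has settled well and contributes thoughtfully in lessons.",
}

def build_report(name, categories):
    """Assemble a draft report from reusable, single-sourced comments."""
    return " ".join(COMMENT_BANK[c].format(name=name) for c in categories)

print(build_report("Sam", ["literacy", "maths", "general"]))
```

The draft would then be edited by the teacher, so the report stays personal while the boilerplate is written only once.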

As the spokesman for the National Association of Head Teachers said:

“Headteachers invest a lot of time and effort into making sure this happens. Technology can help that process but it should never get in the way of a truly personal report for each and every child in the school.”

Is it possible for Technical Authors to write content more quickly?

Approximately 50% of a Technical Author’s day is spent writing. However, when Technical Publications teams look for efficiencies, they tend to focus on the other 50% of the time, which is spent on non-writing activities such as researching, reviewing and planning. They assume the content itself cannot be written more quickly. To an extent, they are right: with a standard qwerty keyboard, which is not an optimal layout, there is a limit to how quickly anyone can type.

We’ve been going through a process of transcribing our early e-learning modules, in order to have scripts upon which we can base future course updates. As part of this project, we’ve been using a free application called Plover to help us write the content. With Plover, you have the potential to create content (in Word, RoboHelp, Flare, Oxygen XML, etc.) at up to 225 words per minute (wpm).

Plover is based on chorded typing: you press more than one key at a time to create whole syllables or words. Chorded typing isn’t new – for example, it was demonstrated in Douglas Engelbart’s famous “The mother of all demos”.
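To illustrate the principle, here is a minimal sketch of the idea behind chorded typing: one chord maps to a whole word. The chord names and dictionary entries are invented for illustration and don’t follow Plover’s real steno notation.

```python
# An illustrative sketch of chorded (stenographic) typing: one chord
# (several keys pressed together) produces a whole word. The entries below
# are made up; Plover's real dictionaries are far larger and use proper
# steno outlines.

CHORD_DICTIONARY = {
    "TEFT": "test",
    "TKOUPL": "document",
    "KAUPBT": "content",
}

def translate(chords):
    """Translate a sequence of chords into text, one stroke per word."""
    return " ".join(CHORD_DICTIONARY.get(chord, f"[{chord}]") for chord in chords)

print(translate(["TEFT", "TKOUPL"]))  # -> "test document"
```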

Below is a five-minute lightning talk on Plover and some of the emerging hardware:

So far, in my case, I’ve been able to double my typing speed. Realistically, those of us participating in this project at Cherryleaf aim to get to 180 words per minute. The reason for this is that most people speak at 160-180 wpm. At that speed, you are able to transcribe subject matter experts in real time – which means there’s no need to record an interview and then type it up at a later date.
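As a rough, back-of-the-envelope illustration of why that speed matters, here is a small calculation. The interview length and qwerty typing speed are assumptions for illustration, not measurements.

```python
# A rough calculation of the time saved by transcribing in real time rather
# than recording an interview and typing it up later. The interview length
# and qwerty typing speed below are assumptions.

speech_wpm = 170        # typical speaking speed (160-180 wpm)
qwerty_wpm = 60         # assumed typing speed when typing up a recording
interview_minutes = 30

words_spoken = speech_wpm * interview_minutes    # ~5,100 words
typing_up_minutes = words_spoken / qwerty_wpm    # ~85 minutes of typing afterwards
realtime_minutes = interview_minutes             # captured as it happens

print(f"Typing up later: {typing_up_minutes:.0f} min; real time: {realtime_minutes} min")
```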

There is a learning curve to this method, but it is based on over 100 years of theory and practice. It is tremendous fun – a bit like learning to use a qwerty keyboard for the first time.

Ted Nelson on the future of text

Mike Atherton, Lead Instructor at General Assembly London, tweeted a link to a 2011 interview with Ted Nelson on the future of text, document abstraction and transclusion.

Ted Nelson is one of the pioneers of information technology. He has been credited with being the first person to use the words hypertext and hypermedia (although he denies this), as well as transclusion and virtuality.

Ted Nelson on the Future of Text, Milde, Norway, October 2011 – from Frode Hegland on Vimeo.

It’s an interesting description of how content should be independent of format and media, so it can be portable, re-usable and presented in ways that best suit a reader’s needs.
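To illustrate transclusion, here is a minimal sketch in which a piece of content is stored once and referenced, rather than copied, into more than one output. The content store, documents and function are invented for illustration.

```python
# A minimal sketch of transclusion: content is stored once and referenced
# wherever it is needed, so each output can present it in the way that best
# suits the reader. The store and document structures are hypothetical.

content_store = {
    "warranty-note": "This product is covered by a two-year warranty.",
}

web_page = ["Welcome to the product guide.", {"transclude": "warranty-note"}]
pdf_manual = ["Chapter 3: Support", {"transclude": "warranty-note"}]

def render(document):
    """Resolve transclusion references against the single content store."""
    return " ".join(
        content_store[part["transclude"]] if isinstance(part, dict) else part
        for part in document
    )

print(render(web_page))
print(render(pdf_manual))
```

Updating the warranty note in one place would change it in both outputs, which is the point Nelson makes about separating content from its presentation.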

Our Webinar in August will be: From Technical Communication to Content Strategy

MadCap Software has asked me to present, as a webinar, one of my presentations from the MadWorld 2014 conference – Bust a Move: From Technical Communication to Content Strategy.

In this webinar, we’ll look at how technical communicators can get more involved in corporate content strategy: why they might want to do that, the differences between technical communication and content strategy, and how they might re-position themselves. We’ll also look at what tools and skills technical communicators can bring across from the technical communications field.

It was a popular session at the conference, standing room only in fact, with 20 minutes of questions from the audience at the end.

This webinar will be held on the 12th August at 4.00pm BST (8:00 am Pacific Time), and it’s a free event.

I’m afraid there was some confusion over which presentation from MadWorld 2014 we would be repeating: I mistakenly stated the webinar would be on metrics. We’ve also had to move the webinar from its original date of the 13th August to the 12th August. Sorry for any confusion caused by these changes.

Ellis

Assessing the potential savings from single sourcing

One of the main benefits of single sourcing is the ability to reuse existing content. Different departments can avoid duplicating work, which means they can save time and money.

Unfortunately, it can be difficult to quantify these savings before you move to an authoring or content management system that enables you to single source. Analysing all the existing documents in a business can be overwhelming, which means organisations often only quantify the savings after the single sourcing content management system has been implemented.

There are a few software applications that can help you analyse your existing content and determine how much duplication exists. You get a sense of how much time and effort was wasted in the past, which is a pretty good indication of how much waste you’d avoid in the future.
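As an illustration of the kind of analysis these tools perform, here is a minimal sketch that flags paragraphs repeated across documents. Real tools also detect near-duplicates; this only catches exact matches after some light normalisation, and the sample documents are invented.

```python
# A minimal sketch of duplication analysis: normalise each paragraph and
# count how often the same text appears across a set of documents.

from collections import Counter
import re

def normalise(paragraph):
    """Lower-case and collapse whitespace so trivial differences don't hide reuse."""
    return re.sub(r"\s+", " ", paragraph.strip().lower())

def duplication_report(documents):
    """Return the paragraphs that appear more than once, with their counts."""
    counts = Counter(
        normalise(p)
        for doc in documents
        for p in doc.split("\n\n")
        if p.strip()
    )
    return {text: n for text, n in counts.items() if n > 1}

docs = [
    "Safety first.\n\nUnplug the device before cleaning.",
    "Unplug the device  before cleaning.\n\nWipe with a dry cloth.",
]
print(duplication_report(docs))  # the shared warning appears twice
```

Multiplying the duplicated word count by your average writing and review cost per word gives a first estimate of the saving single sourcing could have delivered.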


Better content through analytics

At this week’s London Agile Content Meetup, Lana Gibson of the Government Digital Service (GDS) outlined how they use Google Analytics extensively to check and improve the user journey on the GOV.UK website. She said GDS treats this analytical data as the voice of their users – with GDS needing to interpret it and provide what we, as UK citizens, need.

Lana said they need to see what content is getting the most traffic, so that they can ensure that the most popular content is of top quality, and is prioritised within the site.

One of the key things analytics has enabled them to do is improve the connections between different but related needs that were already on GOV.UK. She showed the example of page views for the “Make a SORN” page. The number of views increased by 70,000 in a month simply because they added a link to this page from the related links section on the car tax page. Previously, SORN information wasn’t mentioned on the car tax page.

She also said they treat searches on the GOV.UK website itself as an indication that users haven’t found what they were looking for at the first attempt. As an example, she said that by looking at search terms she discovered lots of people were searching for information about taking rest breaks at work, and that this had been omitted from the page about an employee’s contract and working hours.

Another example she gave was that, on many of the pages about passports, people were searching for “second passport” – users wanting to apply for a second passport. GDS has identified this as a topic that should be added to the site.
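As a minimal sketch of the kind of gap analysis these examples describe, the snippet below counts site-search terms and flags frequent searches that no existing page covers. The search log and covered topics are invented for illustration.

```python
# A minimal sketch of using internal site-search terms to spot content gaps:
# count what people search for and flag frequent terms with no matching page.

from collections import Counter

search_log = ["rest breaks", "holiday entitlement", "rest breaks", "car tax",
              "rest breaks", "holiday entitlement"]
covered_topics = {"car tax", "holiday entitlement"}

term_counts = Counter(search_log)
gaps = {term: n for term, n in term_counts.items()
        if term not in covered_topics and n >= 2}

print(gaps)  # frequent searches with no matching page, e.g. {'rest breaks': 3}
```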

Lana said they also optimise the pages based on the language their audience is using. They found having the most important keywords in the title or first sentence helped people find information quickly. GDS uses analytics, Google Trends and Google AdWords to help them understand what terminology people use. For example, she said they found out their page on annual leave needed a better title: users were actually searching for “holiday entitlement”.

Finally, she said they also use the data to determine what to leave out. If a department wants to add new content to the site, they can use analytics to help assess if there’s actually a need for this content.

Lana’s presentation has been summarised in two excellent blog posts on the GDS website. They are well worth reading: