The Future of Work for Information and Knowledge Professionals

To celebrate the tenth anniversary of NetIKX, the November 2017 meeting was opened free of charge to people from related knowledge and information management organisations. It featured two speakers, Peter Thomson and Stuart Ward, and an extended panel session with questions from the audience.

Peter Thomson on the changing world of work

Peter Thomson is a consultant who advises clients on creating a corporate culture that supports new and better working practices. He is the author of ‘Future Work’, a visiting Executive Fellow at Henley Business School, and Research Director of the Telework Association. Among his current engagements, he is helping the Health Foundation to launch a community of 1,000 health improvement practitioners, and working with Médecins Sans Frontières on developing a knowledge-based evaluation process for their humanitarian response missions.

Thirty years ago he worked for Digital Equipment Corporation (DEC), known for its PDP and VAX lines of multi-user minicomputers. DEC experimented with new ways of networking, and with long-distance working. Surely, they thought, nobody in the 21st century would travel to work, stuck in traffic jams or packed into suburban trains – they would sit comfortably at home, ‘teleworking’. It was a big buzzword at the time, but is now pretty much extinct.

With the benefit of hindsight, Peter notes that technology has changed but people haven’t. Human behaviour is full of ingrained habits, especially so amongst leaders of organisations. So we have the absurdity of people being forced to commute to sit at a desk, and send emails to the person a few metres away.

The younger generation is beginning to question why we continue with these outmoded working practices. The absurdity persists because business leaders want their team around them, under their eye, in a ‘command and control’ notion of how to run a business.

New tech, old habits

He asked: most of the audience have a smartphone, yes? How many had used it that day to actually telephone somebody, compared with sending or reading email or another text-based message? A show of hands confirmed that text-based communication was more prevalent than talking.

Although mobile devices and related technologies are now part of our everyday lives, and the world has become more complex, many of our practices in the world of work are still trying to catch up. Businesses may boast of being Agile, but many of the actual processes are Fragile, he said.

Business communication is spread across a spectrum of behaviours. People still like to get together physically in groups to discuss things. In that setting they employ not only words, but also nuances of expression, gesture, body language and so on. At the other end of the spectrum is email: asynchronous and text-based, with quite a narrow medium of expression. Ranged in the middle we have such things as videoconferencing, audio conference calls, Skype and so on.

Daily business communication is conducted mostly by typing emails – probably slowly – then checking for typing mistakes. Wouldn’t it be quicker to use new technology to send a quick video message? It’s technically possible these days. Look at how people have adopted WhatsApp in their personal lives. But the corporate default is face-to-face physical meetings, email at the other end, and nothing in between. Indeed, the social media tools by which people communicate daily in ordinary life are banned from many workplaces. And then people complain of having too many emails and too many meetings.

Tyranny of 24/7 and the overwork culture

Many people today are the victims of ‘presenteeism’. If you are not already at your desk and working when your managers show up, or if you leave your desk before they do, they won’t be impressed. They can’t sack you for sticking to the hours you are contracted for, but you’ll probably be passed over for promotion. Even if you’re the one who comes up with the best creative ideas, that’s regarded as incidental, secondary to the quantitative measure of how many hours you work.

This has now extended into, or been replaced by, a form of virtual presenteeism. Knowledge and administration work can now be done anywhere. So now we have digital presenteeism, 24/7. ‘Please take your laptop on holiday with you, in case there’s an emergency – and check in every day.’ Or, ‘I tend to send emails over the weekend, and while I don’t insist that you reply to them immediately, those who do are the ones I’ll favour.’ These leadership behaviours force people to be in work mode round the clock. It all builds stress, which the World Health Organization says is a 21st century epidemic.

But many people now won’t work under these conditions. They’d rather quit and set up their own business, or join the ‘gig economy’. They want to own their own time. If you got used to budgeting your time how you see fit, for example at university, you don’t want to be treated like a child by an employer, and be told when to do what.

The typical contract of employment is not about what you achieve – it’s about your hours of work, and you are paid according to the time you spend at work. For example, consider a mother who returns from maternity leave and agrees to work a four-day week rather than five. She benefits from having the extra time; the employer may also benefit, because in trying to get the same work done in four rather than five days a week, she probably skips unproductive meetings and spends less time chatting. But after a while, she finds that however productive she is, she’s being paid four-fifths of what her colleagues are.

At a national level, Peter commented, Britain is quite low on the productivity scale, and yet our working hours are so long.

Challenges to permanent employment

There has been a trend towards outsourcing and subcontracting: consider call centres in India or companies having products made in China. Will there now be a second wave of this, at the management and administration level, in which inefficient layers are taken out of corporate organisations and the organisation gets the professional inputs it needs from subcontractors?

We’re seeing the collapse of the pension paradigm. The conventional model is predicated on the idea of ‘superannuation’, and a few years of retirement before you die. But with today’s longer lifespans, thinking of seniors as being too old to contribute knowledge and skills is increasingly untenable — and anyway, it’s proving impossible to fund a long retirement from the proceeds of 40 years of employment. Nor can the State pension scheme fund extended pensions from the taxes paid by (a declining proportion of) younger people in work. Is retirement then an antiquated idea?

Peter closed by wondering what people’s ideal workplace might be — where they are at their most creative. Within the audience, people mentioned a variety of domestic settings, plus walking and driving. Peter imagines the organisation of the future as a network of people, working in such conducive conditions, and connected by all the liberating means that technology can bring us. Are we ready for this new world?

Stuart Ward on KIM and adding value to organisations

Stuart was the first chair of NetIKX and has been involved with our community throughout. The first meeting of NetIKX was addressed by David Skyrme, who spoke about the value of knowledge and information management (KIM for short, hereafter); that value would also be the main focus of Stuart’s presentation. He believes it can be challenging for KIM professionals (KIPs) to prove their value to the organisations in which they work.

Knowledge and information are the life-blood of organisations; those who use them well will prosper, those who don’t will wither. From that, one might expect the KIPs to be highly valued, but often it is not the case.

Stuart identifies four things he thinks are important if KIPs are to survive and flourish in an organisation: to focus on creating value; to link KIM activities to the organisation’s goals and objectives; to be clear about everyone’s responsibilities in relation to KIM (there are various such roles, whether in creating and disseminating information products or in managing data and information resources); and, finally, to ensure the organisation has the right structures, skills and culture to make best use of what KIM can provide.

‘Value’ means different things in different enterprises. Commerce will focus on value for shareholders and other stakeholders, and customer service. In the public sector, and for non-profits, value could mean packaging information so that citizens and other service users can make best use of it.

A six-part model

Stuart has long promoted a model that is structured around six mechanisms through which information and knowledge can be used to deliver value to an organisation. They are:

  • developing information and knowledge products which can be marketed;
  • helping to drive forward the business strategy;
  • enabling organisational flexibility and change;
  • improving corporate decision making;
  • enabling and improving all business processes, especially the key ones; and
  • enhancing the organisation’s ‘intellectual capital’.

Looking at these in turn…

Information and knowledge products: Some businesses (publishers, media companies, law firms etc) create products for sale to the public or to other businesses. Others, such as local government or non-profits, produce reports and studies: though not for sale, these are crucial in their work of informing the public, influencing government or what have you.

Driving business strategy and renewal: Organisations often must change to survive, and here KIM can deliver value by enabling innovation. Apple Computer almost hit the rocks a couple of times, but through KIM and innovative product and service design became highly profitable. It’s important to sense the direction the market is headed: Blackberry and Nokia are examples of companies which failed to do that.

Enabling organisational change and flexibility: Good KIM helps an organisation to be sensitive to changing business opportunities and risks, to improve efficiency and cut costs. Here, the last thing one needs is to have knowledge trapped within silos. Efficient sharing of knowledge and information across the organisation is key.

Improving decision making: The mantra is, ‘Get the right information at the right time, to the right people.’ Good decision making requires an understanding of options available, and the consequences of making those choices – including the risks. Bad decisions are often made because of the prejudices of the decision-makers, who have the power to prevail in the face of evidence, so it’s important that the organisation has the right cultural attitudes towards decision-making, knowledge and information.

Continuous improvement: Almost always, business processes could be done better, and proper curation of information and knowledge is the key to this. Good ideas need not be limited to a discrete business process, and can inspire changes in other activities.

Enhancing intellectual capital: One of the most important realisations in KIM is that, just as money and equipment and premises and people are assets to a business, so are information and knowledge, and they should be managed properly. Yet many organisations don’t have an overview of what those intellectual assets are. As engineer Lewis Platt, 1990s CEO of Hewlett-Packard once said, ‘If we only knew what we know, we’d be three times more profitable.’ (Platt was also famous for his practice of ‘management by walking around’, immersing himself in the operational side of the business rather than staying aloof in his office.)

Linking KIM to key goals: the benefits matrix

Stuart then proposed a methodology for fitting IM and KM to an organisation’s key goals and objectives. As a tool, he recommends a ‘benefits matrix’ diagram, somewhat like a spreadsheet, in which the row headings define the organisation’s goals, its aims and objectives, while column headings define existing or possible future KIM services and capabilities. This creates an array of cells, each one of which represents how one particular KIM service or capability maps to one of the organisation’s goals. In these cells, you can then list the benefits, which may be quantifiable (e.g. increased income, reduced cost) or unquantifiable.

Stuart gave the example of an organisation having a Document Management System (represented as a column on the matrix). How might that map across to the company’s goal of reducing overhead costs? Well, a quantifiable result might be the saving of £160K a year, while unquantifiable benefits could include faster access to information, and a reduced risk of losing critical information.
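The benefits matrix can be sketched as a simple data structure. The following is an illustrative sketch only – the goal, service and benefit names are hypothetical stand-ins, not taken from Stuart’s slides:

```python
# A benefits matrix: rows are organisational goals, columns are KIM
# services/capabilities, and each cell lists the quantifiable and
# unquantifiable benefits of that service for that goal.
from dataclasses import dataclass, field

@dataclass
class Cell:
    quantifiable: list = field(default_factory=list)    # e.g. money saved
    unquantifiable: list = field(default_factory=list)  # e.g. reduced risk

goals = ["Reduce overhead costs", "Improve customer service"]
services = ["Document Management System", "Enterprise search"]

# matrix[goal][service] -> Cell
matrix = {g: {s: Cell() for s in services} for g in goals}

cell = matrix["Reduce overhead costs"]["Document Management System"]
cell.quantifiable.append("Saving of £160K a year")
cell.unquantifiable.append("Faster access to information")
cell.unquantifiable.append("Reduced risk of losing critical information")

# Render the populated cells for review with senior management.
for g in goals:
    for s in services:
        c = matrix[g][s]
        if c.quantifiable or c.unquantifiable:
            print(f"{g} × {s}:")
            for b in c.quantifiable:
                print(f"  [quantifiable] {b}")
            for b in c.unquantifiable:
                print(f"  [unquantifiable] {b}")
```

Even in this toy form, the empty cells are informative: they show goals that no current KIM service supports.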

Stuart likes this kind of exercise, because it stimulates thinking about the way in which information and knowledge management initiatives can generate benefits for the organisation’s self-identified objectives. It isn’t narrow-minded with a focus only on quantifiable benefits, though it does stimulate thinking about how one might define metrics, valuable for monitoring results. Finally, it is strongly visual and easy to assimilate, and as such is good for engaging with senior management.

Responsibilities, capabilities

It would be a mistake to assume that KIM responsibilities attach only to people explicitly employed for a KIM role. Business leaders also have strategic responsibilities for KIM, and every ‘ordinary’ worker has KIM responsibilities too. There are special defined responsibilities for those with ‘steward’ roles, for those who create and manage information and knowledge resources as their main job, and also those who manage IT and other kinds of infrastructure and services which help to manage and deliver the resources to where they are needed.

Stuart’s slide set included several detailed bullet-point slides on the KIM responsibilities that might attach to these various roles, but we skipped over a detailed examination of these due to pressure of time. [The slide set is available at www.netikx.org to NetIKX members and to those who attended the meeting.]

A cyclical process

Stuart’s final diagram suggested that there is a cyclical process between two spheres: the first sphere represents the organisation’s data and information, and what it knows. Through good management and use of these resources, the organisation hopefully performs well and succeeds in the second sphere, that of action. By monitoring performance, the organisation can learn from experience, and that feeds back to the first sphere.

Learning from the Hawley Committee

In 1995 the Hawley Committee, chaired by Sir Robert Hawley, under the auspices of the KPMG IMPACT Programme, published the results of an investigation into information management, and the value assigned to information, in several large UK businesses. Entitled ‘Information as an Asset – The Board Agenda’, the report set out ten agenda points, of which three have to do with responsibilities of the Board of Management. CILIP have recently shown a renewed interest in the Hawley Report, and may soon republish it, with updates to take account of changes between then and now.

Panel Q&A session

The panel discussion session had something of a BBC ‘Any Questions’ flavour. Before the session, people sat in table groups and came up with written questions, which panel chair David Penfold collected and collated during the tea break. He then grouped similar questions and put them to the panel, which consisted of Noeleen Schenk of Metataxis, David Gurteen, David Smith (Government KIM Head of Profession), Karen McFarlane (Chair of CILIP Board), David Haynes (chair of ISKO UK) and Steve Dale.

Will KIM professionals become redundant?

Stuart Ward asked for the panel’s opinion on whether knowledge and information management professionals might soon be redundant, as their skills are diffused to a wider group of people through exposure to technology at school and university. Joanna asked how we can create knowledge from the exponentially growing amount of information; and Alison wondered whether the information available to all on Wikipedia is good enough – or are we looking for a perfect solution which doesn’t exist?

Steve Dale responded mostly to Stuart’s question. He has observed how in most of the organisations with which he works, KIM functions are being spread around the organisation. Organisations like Cadbury’s, the British Council and PWC no longer have a KIM department per se. Knowledge has become embedded in the organisation. But Steve still sees a future role for KIM professionals (or their equivalent – they may be called something else) as organisations turn to machine-augmented decision-making, ‘big data’, and machine learning.

Consider machine learning, in which computer systems are fed with truckloads of data and process it to discover patterns and connections. If there is bias in the data, there will be bias in the outcomes – who is checking for that? This is where wise and knowledgeable humans can and should intervene to manage the quality of the data, and to ensure that any ‘training set’ is truly representative.
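One simple, concrete form such a human check can take – a sketch of the idea rather than anything discussed at the meeting – is to compare category proportions in a training set against a reference population and flag large deviations:

```python
# Illustrative representativeness check: flag categories whose share in the
# training data deviates from the reference population by more than a
# chosen tolerance. All data here is made up for the example.
from collections import Counter

def proportions(labels):
    counts = Counter(labels)
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def flag_bias(train_labels, population_props, tolerance=0.10):
    """Return {category: (observed, expected)} for over/under-represented
    categories in the training data."""
    train_props = proportions(train_labels)
    flagged = {}
    for category, expected in population_props.items():
        observed = train_props.get(category, 0.0)
        if abs(observed - expected) > tolerance:
            flagged[category] = (observed, expected)
    return flagged

# Hypothetical data: the population is 50/50, but the training set is skewed.
train = ["A"] * 80 + ["B"] * 20
print(flag_bias(train, {"A": 0.5, "B": 0.5}))  # both categories flagged
```

A check like this won’t catch subtler forms of bias, but it makes the ‘who is checking?’ question answerable: someone has to decide what the reference population should be.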

Karen McFarlane also responded to Stuart’s question. With her GCHQ and National Cybersecurity Centre background, she sees a continued and deepening need for skills in data and information governance, information assurance and cyber-security; also in information risk management, and data quality. KIM professionals have those kinds of skills. As for Stuart’s assertion that exposure to technology at university is enough to impart those skills – she thinks that is definitely not the case. Such people often don’t know how to manage the information on their desktops [let alone at an enterprise level].

Noeleen Schenk, in contrast, replied that she didn’t think it should be necessary to teach people how to manage the information they work with, so long as there were rule-based technical systems to do information governance automatically (for example, through auto-categorisation). But who will write the rules? That’s where the future of KIM work may lie.
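A toy illustration of what such a rule-based system might look like – the rules, categories and retention periods below are invented for the example, not taken from any product Noeleen named:

```python
# Rule-based auto-categorisation: each rule maps trigger keywords to a
# category and a retention policy, so basic governance is applied as
# documents are filed. Writing and maintaining RULES is the KIM work.
RULES = [
    {"keywords": {"invoice", "payment"},    "category": "Finance", "retain_years": 7},
    {"keywords": {"contract", "agreement"}, "category": "Legal",   "retain_years": 10},
]

def categorise(text, rules=RULES):
    words = set(text.lower().split())
    for rule in rules:
        if words & rule["keywords"]:
            return rule["category"], rule["retain_years"]
    # No rule matched: a human (the rule-writer) must decide.
    return "Uncategorised", None

print(categorise("Invoice for October payment"))  # ('Finance', 7)
```

The point of the sketch is Noeleen’s question in miniature: the system runs itself, but someone with domain knowledge has to write, test and revise the rules.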

David Haynes offered the perspective of someone teaching the next generation of KIM professionals (at Dundee and at City University of London). He is impressed by the diversity of backgrounds among people drawn to take courses in Library and Information Studies, and Archives and Records Management: it shows how relevant these skills are to many fields of human activity. He would like to see in future a larger emphasis on information governance skills, because many LIS-trained people go on to take up roles as data protection officers, or working on compliance matters.

David Smith thinks KIM professionals do risk extinction if they don’t change. He remembers that when he joined the Civil Service, doing online search was an arcane craft for specialists – that’s gone! He agrees that information governance is a key skill. Could anyone do that? Yes… Would they make mistakes? Certainly! This is where KIM professionals should be asserting their skills and gaining a reputation for being the go-to person for help in such matters.

David Gurteen doesn’t think the need to manage information and knowledge will go away – quite the reverse. One recently-arising topic has been that of ‘fake news’ and bias, which for him highlights the need for information and media literacy skills and judgement to be taught and learnt.

Training the robots?

Claire Parry referred to the greater availability today of information which has been structured to be ‘understandable’ by machines as well as by humans. What did the panel think might be KIM professionals’ roles in training the machines and dealing with ‘bots’?

Steve Dale said that artificial intelligence has been a big interest of his in recent years. A lot of young people are out there coding algorithms, and some machines are even crafting their own through machine learning. That’s fine for game-playing, but in matters of more importance, affecting the lives of citizens, we must be concerned when machines evolve their own algorithms that we don’t understand. The House of Commons Science and Technology Committee is requesting information from organisations creating these kinds of tools, so they can consider the implications. Steve said that, when some algorithm is being used to augment decision-making in a way which affects him, he wants to know about it, and about what data is being used to inform it.

Rob Rosset wondered whether it is possible to create an algorithm that does not have some form of bias within it. David Gurteen thought ‘bias’ was inevitable, given that programming always proceeds from assumptions.  Noeleen Schenk thought that good data governance could at least reveal to us the provenance and quality of the data being used to inform decisions. David Haynes agreed, and referred to the ‘ethics of information’, noting that CILIP’s model of the Wheel of Knowledge places ethics at the very centre of that diagram.

Steve Dale mentioned he had just been at an event about whether AI will lead to job losses, where people discussed the algorithms Facebook uses. Facebook now realises that it can’t detect online abuse algorithmically, so it is in the process of recruiting 10,000 human moderators! So the adoption of AI may even be creating job opportunities.

The gig economy; face-to-face vs virtual working

David Penfold brought forward four questions which he thought might be related to each other.

Kathy Jacob asked what impact ‘gig economy’ workforce arrangements will have on knowledge and information work in the future, particularly in knowledge creation and use. Valerie Petruk asked whether the gig economy is a necessary evil. Sarah Culpin wondered how to get the right balance between face-to-face interactions and virtual working spaces; and Jordana Moser similarly wondered how we organise to meet human needs as well as the demands of efficiency and productivity. For example, a face-to-face meeting may not be the most efficient way of getting work done, but it has value on other levels.

David Smith thought that the ‘gig economy’ probably is a necessary evil. Records management for government has become increasingly commoditised: when a task emerges, you buy in people to do it, then they go. It’s a balancing act, because some work is more appropriately done by people on the payroll, and some doesn’t have to be. Procurement skills have therefore become more important – deciding what work you keep in-house, what you farm out, and what you bring people in for on a temporary basis.

David Haynes noted the loss of rights that comes with the ‘gig economy’ – being employed has benefits for people. He himself has been both employed and self-employed, and it has worked out fine for him, but people engaged for more routine tasks can easily be exploited: when they are ill, they aren’t paid; they don’t get holiday pay; and so on. Peter Thomson in his talk had proposed being ‘paid by the piece’ rather than for time on the job, but David thinks that going down this path not only imposes on individuals, but brings a cost to the whole of society too.

Noeleen Schenk finds that a ‘gig economy’ approach suits her, because she likes a portfolio lifestyle. If you combine it with the Internet’s opportunities for long-distance working, it’s brilliant that an enterprise can find someone with just the skills they want, who can provide that service from the other side of the world.

Moving to address Kathy Jacob’s question directly, Noeleen thinks that knowledge capture will move from writing things down towards voice capture, plus voice-to-text conversion, such that there will be fewer low-grade tasks to be assigned to temporary workers. However, what gig work methods do risk losing is the organisational knowledge that comes with continuity of shared experience in the enterprise.

Karen McFarlane said that we need both face-to-face and distant working. We are humans; we work best in a human way. We can blend in virtual meetings, virtual communities; but these virtualised relationships always work best if you have met the other person(s) in real life.

David Gurteen is definitely in favour of face-to-face conversation. He has been experimenting with holding his Knowledge Café meetings using Zoom, but he thinks that if you can meet face to face, it’s better; doing it remotely is something you do if you have to. Nancy Dixon talks about the ‘oscillation principle’ – if you have a geographically dispersed team, every so often you have to bring them together (see her blog post at https://www.druckerforum.org/blog/p=881 – she talks about ‘blend[ing] sophisticated virtual tools with periodic, in-depth, face-to-face collective sensemaking’).

Recruitment criteria, and the robots (again)

Judith Stuart, who lectures at the University of the West of England, asked what skills and knowledge the panel look for in new recruits and appointments to knowledge management roles in organisations.

David Haynes replied in one word: ‘flexibility’, and other panellists agreed; David Gurteen would add ‘curiosity’. Noeleen’s answer was similar – adaptability, and the ability to cope with uncertainty.

Karen McFarlane said that when she used to recruit people to roles in records, information or knowledge management, she looked out for people who had a love of information. Yes, flexibility was also amongst her criteria, but also the inter-personal skills to be able to work in a team.

David Penfold thought it was interesting that no-one had mentioned professional skills! Karen replied that of course those were required, but her response to the question was about what would make a candidate ‘stand out’ in her eyes. Noeleen added that professional skills can be learned, but the softer skills can’t so easily.

Steve Dale referred to a company he would shortly be meeting, called HeadStart, which is using artificial intelligence and machine learning working on data (such as exam results, social media interventions) to identify candidates for organisations. They claim to shorten the time and lower the cost of getting the right people into the right jobs. He has been wondering how they would know what ‘a good result’ or ‘a bad result’ looks like…

David Haynes noted that the new data protection regulation will give people the right to challenge how decisions are made by automated systems, and to insist on human intervention if they don’t like the basis on which decisions are made.

Is it good to be averse to risk?

Anna Stothard asked for top tips or recommendations for changing a risk-averse culture, and getting more buy-in to new ideas from senior management.

David Smith remarked that government is keen on risk-aversion! Indeed the best way to get civil service management attention is to say, ‘If you want to avoid risk, do this.’ If he tells them about various benefits that a new approach could bring, he’ll be politely ignored. If he describes all sorts of bad things that could be avoided – then they are all ears (though one shouldn’t overdo it).

It all depends on your organisational culture; you need to assess management’s appetite for risk, and to make sure people understand the nature of the risks. He gave the example of a local government organisation that had turned down a Freedom of Information request on the grounds that it was ‘impertinent’, when what was underlying the response was a risk-averse culture.

Steve Dale said that in his consulting role he has often had to try to convince senior management that a change would be beneficial. His rule of thumb is to pay attention to Return on Investment (ROI): if the investment can be kept modest, the proposal is more likely to find favour.

Noeleen Schenk generally prefers to argue for change because of the benefits it will bring, but she had recently worked with a client where the concern was mostly about risk. So the project on which she was working was converted from a ‘value adding’ one to a ‘risk reduction’ one instead.

The role of knowledge organisation?

Keri Harrowven asked what role knowledge organisation plays in promoting knowledge and information management in the workplace.

Noeleen Schenk replied that, for Metataxis, knowledge organisation has a central role. But many people regard KO as an overhead, and an unnecessary expense. It takes time and effort to get KO right, but Noeleen will ask – ‘If you can’t find it, why have you got it?’ She recalled a client with about 110,000 boxes of documents in offsite storage, with next to no indexing, but they insisted they wanted to keep it all – at huge cost. She asked them, could they find anything? (No.)

Just because knowledge organisation is hard doesn’t mean that you shouldn’t do it. She’d say: start with some architecture, then build out. In a recent project, she began by putting in place a set of requirements for how newly generated information is handled.

David Haynes noted that the UK Chapter of the International Society for Knowledge Organization often visits these topics. Like Noeleen, he thinks that there is no point in hoarding information if you can’t retrieve it. That leads to such KO questions as how you categorise information, how it is described, what metadata can be captured and attached, and what search and discovery tools you can put in place. It also goes into what the organisation’s needs are, what is the nature of the information you are faced with, and how you make that connection.

Also of increasing importance is how we can exploit information. Linked Data is an approach showing incredible potential, enabling new applications such as combining map data with live sensor and update feeds – for example, the data revolution which helps Transport for London passengers know when their next bus is coming and where it is now. But none of these novel forms of exploitation would be possible without robust schemes for classifying and managing the information sources.
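The core Linked Data idea can be shown in a few lines: facts from different sources are expressed as subject–predicate–object triples that share identifiers, so datasets can be joined. The bus, route and stop identifiers below are invented for illustration, not real TfL data:

```python
# Minimal illustration of the Linked Data idea: a static route dataset and a
# live position feed are joined on a shared identifier ("bus:73").
route_data = [
    ("bus:73", "hasRoute", "route:73"),
    ("route:73", "terminatesAt", "Stoke Newington"),  # hypothetical data
]
live_feed = [
    ("bus:73", "lastSeenAt", "stop:Angel"),  # hypothetical data
]

def query(triples, subject=None, predicate=None):
    """Return triples matching the given subject and/or predicate."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)]

all_triples = route_data + live_feed
# Everything known about bus:73, from both sources at once:
print(query(all_triples, subject="bus:73"))
```

The join only works because both sources use the same identifier scheme – which is exactly the classification and management work the paragraph above describes.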

Finally, knowledge organisation is key to information governance.

Silos or outstations?

Someone asked: ‘Does having KM roles in an organisation create silos? How can we move towards a more embedded approach?’

Karen McFarlane described a hybrid approach in which her organisation had a central KIM team, which might be considered a silo; but she funded the placing of KIM professionals into teams of analysts for a year, helping them to develop their own information management skills. In every case, the teams that had had the benefit of working with the KIM professional wanted to find the funds to continue that work from within their own budgets.

Information governance directions?

Martin Newman wanted to know where panellists thought information governance was going, as the two initial speakers seemed to predict new ways of working in which information roles would be decentralised.

David Haynes replied that KIM professionals are increasingly being tasked with data protection and information governance framework development. But he doesn’t think that they can work on their own. They have to work with the people on the legal side, and the people delivering the technology. It doesn’t really matter who is ‘in charge’, so long as there is that sort of coalition, and that it is embedded within the organisation.

Noeleen Schenk said she had noticed enormous variability in where information governance tasks are run from – sometimes legal, sometimes the executive, sometimes IT. Arguably, all is well if how people collaborate matters more to them than where they sit. She has noticed a trend of information governance roles moving from the centre, along ‘spokes’, towards decentralised clusters of people; but it is even better if the way people work at every level supports good governance, rather than it having to be done for them.

David Smith said that the culture of the civil service is already imbued with an instinct to take good care of information. Yes, silos are there – he gave us a picture of ‘fifteen hundred silos in a single department, flying in close formation’. Teams have got smaller – to do with cuts, as much as anything else. Not every information asset is treated equally; it depends on the risks attached. It’s a question of expedience, and of balancing risk against cost.

His own department manages information about the European Regional Development Fund. If the European Court of Auditors asks to see information about any ERDF project in the UK, his department has 72 hours to find it; otherwise there is a fine of up to 10% of the value of the ERDF loan that financed that project. Imagine the prospect of a fine of £100,000 if you can’t find a file in 72 hours! You can bet the department has that information rigorously indexed; whereas other areas are managed with a lighter touch, as they don’t carry the same risks.

There is also variability across government as to whether the work is done at the ‘hub’ or along the ‘spokes’.

Steve Dale pointed out that silos can exist for a reason – an example would be to maintain security in investment banking.

Globalism and process alignment

Emma Bahar had a question: ‘How can processes be managed in global organisations in which alignment is likely impossible?’

Steve Dale used to work for Reuters, with offices in every country. They managed very well in aligning their processes. Indeed, their whole business model relied on good interchange of quality information. He thought most global organisations would wither and die if there wasn’t good interchange and standardisation of processes. Yes, there will be cultural differences, and in Reuters they encountered these and learned to work with them.

Wikipedia again

David Penfold suggested returning to the question about the quality of information available on Wikipedia: are we asking too much, looking for a perfect solution which doesn’t exist? Universities typically talk down Wikipedia, and students are not allowed to quote it as a reference. Is that realistic?

David Haynes pointed out that Wikipedia editing is moderated. A study some years ago compared the accuracy of Wikipedia articles against Encyclopaedia Britannica online, and Wikipedia was found to be superior. He advises students that Wikipedia is a fantastic resource and they should use it – but not quote it! If Wikipedia gives a reference to a source document [according to Wikipedia’s ‘no original research’ rules, every assertion should be backed up by a citation], then go to that source and quote that. Wikipedia should be regarded as a secondary source, a good entry point into many subjects. Indeed, David uses it that way in his own research.

Noeleen Schenk hinted at possible double standards. In the analogue world we never relied on Encyclopaedia Britannica for everything. She thought that some of the discomfiture was about how Wikipedia is authored by tens of thousands of volunteers. We should remember that enthusiastic amateurs helped to expand the boundaries of science; they are not necessarily ignorant or inept.

The panel agreed that Wikipedia should be regarded as one source amongst many. Noeleen compared this to reading several newspapers to get an angle on something in politics. How you assess sources brings us back to the topic of Information Literacy – not, perhaps, the best term for it, but as David Haynes confirmed, critical assessment of information sources is actually being taught (to KIM students, anyway).

Generational attitudes

Graham Robertson noted that Peter Thomson had talked about ‘millennials’ and their attitudes, and Graham wondered what the panel thought about the role the younger generations would play in changing attitudes, cultures and practices around KIM in organisations. Do younger people use and process information in a different way?

David Smith said he had been doing a review, in which it was interesting to compare how older and younger people were sharing information within their cohorts. In the case of older members of staff, one could track discussions via email. But the younger staff members appeared to be absent. Why? It turned out that they communicated with work colleagues using WhatsApp. Because it was a medium with which they were familiar, it was a quick way for them to ‘spin up’ a conversation. Of course, this poses new challenges for organisations. Discussions and information sharing are absent from the network (and apparently WhatsApp security isn’t up to much).

Noeleen Schenk thought it was a fool’s errand to try to force people to work in a way which they don’t find natural. She doesn’t know what the solution is, but we need to think afresh about how important information and knowledge are kept track of – the current crop of tools seems inadequate.

Facing down ‘alternative facts’

Conrad Taylor asked: What are, or could be, the roles and responsibilities of all who work with knowledge and information – including teachers and journalists – in helping people to learn how to weigh evidence, and distinguish fact from falsehood and propaganda, both in ‘big media’ and in social media?

David Haynes noted that this was increasingly a focus in meetings of KIM professionals [it was the subject of a panel session at the 2017 ISKO UK conference]. How can people be sure they are receiving unbiased information? Or if, like Steve Dale, we think that there cannot be unbiased information, we will have to be open to a range of information sources, as Noeleen had suggested.

David Penfold noted that in recent partisan political debate on social media, bots had been unleashed as disseminators of propaganda. Conrad noted that Dave Clarke of Synaptica has proposed a taxonomy of different sources of misleading information (see resources at https://davidclarke.blog/?page_id=16). The panel noted that the role played by paid-for posts on, for example, Facebook, and the way Facebook’s personalisation algorithms work, are coming under closer scrutiny.

Conrad regretted that the term ‘Information and Knowledge Professional’ is often used to mean only people who curate information, excluding people whose job it is to create information – as writers, and designers and illustrators too. It is all too common to see data graphics that have been created in support of an editorial line, and which are misleading. (Indeed Steve Dale addressed this at a recent NetIKX meeting.)

Steve Dale remarked that we now have a new weapon to counteract ‘phishing’ attacks where fake online approaches are made in an attempt to defraud us of money, steal our identity, etc. It’s called Re:scam (https://www.rescam.org) and if you forward scammy emails to it, its artificial personalities will engage the scammers in an exchange of emails that will waste their time!

At this point, we ran out of time, but continued discussions over drinks. Face-to-face, naturally!


NetIKX Programme for 2018

The first meeting of 2018 is on Thursday 25 January:

Making true connections in a complex world: new technologies to link facts, concepts and data – Thursday 25 January 2018

A PDF giving details of the meeting will be available shortly.

The full 2018 programme will be announced shortly.


Taxonomy Boot Camp London

A 25% discount on the fee at the Taxonomy Boot Camp London (17–18 October) has been negotiated for NetIKX members. See http://www.netikx.org/forum/netikx-member-discount-taxonomy-boot-camp-2017 for the discount code.


The Implications of Blockchain for KM and IM

Conrad Taylor writes:

The speakers at the meeting on 6 July 2017 were Marc Stephenson, Noeleen Schenk and John Sheridan.

Marc Stephenson is the Technical Director at Metataxis. He has worked on the design, implementation and ongoing management of information systems for over 25 years, for organisations in health, central and local government, banking, utilities, new media and publishing. He has architected and implemented many IT solutions, including intranets, document management systems, records management systems and ECM portals. Marc recognises the need to design solutions that deliver maximum benefit at minimal cost, by focusing on the business, the users and, crucially, the information requirements, rather than on unnecessary technology and functionality.

Noeleen Schenk has over twenty years’ experience of working in the information sector as a practitioner, researcher and consultant. Her recent projects have focused on all aspects of information and knowledge management – from governance to assurance – helping clients successfully manage their information and minimise the risk to their information assets. These projects include information security, information and data handling, information risk management, and document and records management. In addition to working with clients, Noeleen is passionately interested in the constantly changing information and knowledge management landscape, the use of technology, and new ways of working – helping businesses identify critical changes, assess the opportunities, then develop options and map out strategies to turn them into reality.

John Sheridan is the Digital Director at The National Archives, where he leads the development of the organisation’s digital archiving capability and the transformation of its digital services. John’s academic background is in mathematics and information technology, with a degree in Mathematics and Computer Science from the University of Southampton and a Master’s Degree in Information Technology from the University of Liverpool. John recently led, as Principal Investigator, an Arts and Humanities Research Council funded project, ‘big data for law’, exploring the application of data analytics to the statute book. More recently he helped shape the Archangel research project, led by the University of Surrey, looking at the applications of distributed ledger technology for archives. A former co-chair of the W3C e-Government Interest Group, John has a strong interest in web and data standards. He serves on the UK Government’s Open Standards Board, which sets data standards for use across government. John was an early pioneer of open data and remains active in that community.

Blockchain is a technology that was first developed as the technical basis for the cryptocurrency Bitcoin, but there has been recent speculation that it might be useful for various information management purposes too. There is quite a ‘buzz’ around the topic, yet it is too complex for many people to figure out, so it’s not surprising that the seminar attracted the biggest turnout of the year so far.

The seminar took the form of three presentations, two from the consultancy Metataxis and one from The National Archives. The table group discussions that followed were open and unstructured, with a brief period at the end for sharing ideas.

The subject was indeed complex and a lot to take in. In creating this piece I have gone beyond what we were told on the day, done some extra research, and added my own observations. I hope this will make some things clearer, and qualify some of what our speakers said, especially where it comes to technical details.

Marc Stephenson gives a technical overview

The first speaker was Marc Stephenson, Technical Director at Metataxis, the information architecture and information management consultancy. In the limited time available, Marc attempted a technical briefing.

Marc’s first point was that it’s not easy to define blockchain. It is not just a technology, but also a concept and a framework for ways of working with records and information; and it has a number of implementations, which differ in significant ways from each other. Marc suggested that, paradoxically, blockchain can be described as ‘powerful and simple’, but also ‘subtle, and difficult to understand’. Even with two technical degrees under his belt, Marc confessed it had taken him a while to get his head around it. I sympathise!

The largest and best-known implementation of blockchain so far is the infrastructure for the digital cryptocurrency ‘Bitcoin’ – so much so that many people get the two confused (and others, in my experience, think that some of the features of Bitcoin are essential to blockchain – I shall be suggesting otherwise).

Wikipedia (at http://en.wikipedia.org/wiki/Blockchain) offers this definition:

A blockchain […] is a distributed database that maintains a continuously growing list of ordered records called blocks. Each block contains a timestamp and a link to a previous block. By design, blockchains are inherently resistant to modification of the data — once recorded, the data in a block cannot be altered retroactively. Through the use of a peer-to-peer network and a distributed timestamping server, a blockchain database is managed autonomously… [A blockchain is] an open, distributed ledger that can record transactions between two parties efficiently and in a verifiable and permanent way. The ledger itself can also be programmed to trigger transactions automatically.

Marc then dug further into this definition, but in a way which left some confused about what is specific to Bitcoin and what are the more generic aspects of blockchain. Here, I have tried to tease these apart.

Distributed database — Marc said that a blockchain is intended to be a massively distributed database, so there may be many complete copies of the blockchain data file on server computers in many organisations, in many countries. The intention is to avoid the situation in which users of the system have to trust a single authority.

I am sceptical as to whether blockchains necessarily require this characteristic of distribution over a peer-to-peer network, but I can see that it is valuable where serious issues of trust are at stake. As we heard later from The National Archives, it is also possible to create similar distributed ledger systems shared between a smaller number of parties which already trust each other.

Continuously growing chain of unalterable ‘blocks’ — The blockchain database file is a sequential chain divided into ‘blocks’ of data. Indeed, when blockchain was first described in 2008 by ‘Satoshi Nakamoto’, the system’s pseudonymous creator, the phrase ‘block chain’ was presented as two separate words. When the database is updated by a new transaction, no part of the existing data structure is overwritten. Instead, a new data block describing the change or changes (in the case of Bitcoin, a bundle of transactions) is appended to the end of the chain, with a link that points back to the penultimate (previous) block; which points back to the one before; and so on back to the ‘genesis block’.

One consequence of this data structure is that a very active blockchain that’s being modified all the time grows and grows, potentially to monstrous proportions. The blockchain database file that maintains Bitcoin has now grown to 122 gigabytes! Remember, this file doesn’t live on one centralised server, but is duplicated many times across a peer-to-peer network. Therefore, a negative consequence of blockchain could be the enormous expense of computing hardware resources and energy involved in a blockchain system.

(As I shall later explain, there are some peculiar features of Bitcoin which drive its bloat and its massive use of computational resources; for blockchains in general, it ain’t necessarily so.)

Timestamping — when a new block is created at the end of a chain, it receives a timestamp. The Bitcoin ‘timestamp server’ is not a single machine, but a distributed function.

Encryption — According to Marc, all the data in a blockchain is encrypted. More accurately, in a cryptocurrency system, crucial parts of the transaction data do get encrypted, so although the contents of the blocks are a matter of public record, it is impossible to work out who was transferring value to whom. (It is also possible to implement a blockchain without any encryption of the main data content.)

Managed autonomously — For Bitcoin, and other cryptocurrencies, the management of the database is done by distributed software, so there is no single entity, person, organisation or country in control.

Verifiable blocks — It’s important to the blockchain concept that all the blocks in the chain can be verified by anyone. For Bitcoin, this record is accessible at the site blockchain.info.

Automatically actionable — In some blockchain systems, blocks may contain more than data; at a minimum they can trigger transfers of value between participants, and there are some blockchain implementations – Ethereum being a notable example – which can be programmed to ‘do’ stuff when a certain condition has been met. Because this happens without user control, without mediation, all of the actors can trust the system.

Digging into detail

In this section, I am adding more detail from my own reading around the subject. I find it easiest to start with Bitcoin as the key example of a blockchain, then explore how other implementations vary from it.

‘Satoshi Nakamoto’ created blockchain in the first place to implement Bitcoin as a digital means to hold and exchange value – a currency. And exchange-value is a very simple thing to record, really, whereas using a blockchain to record more complex things such as legal contracts or medical records adds extra problems – I’ll look at that later. Let’s start by explaining Bitcoin.

Alice wants to pay Bob. Alice ‘owns’ five bitcoins – or to put it more accurately, the Bitcoin transaction record verifies that she has an entitlement to that amount of bitcoin value: the ‘coins’ do not have any physical existence. She might have purchased them online with her credit card, from a Bitcoin broker company such as eToro. Now, she wants to transfer some bitcoin value to Bob, who in this story is providing her with something for which he wants payment, and has emailed her an invoice to the value of 1.23 BTC. The invoice contains a ‘Bitcoin address’ – a single-use identifier token, usually a string of 34 alphanumeric characters, representing the destination of the payment.

To initiate this payment, she needs some software called a ‘Bitcoin wallet’. Examples are breadwallet for the iPhone and iPad, or Armory for Mac, Linux and Windows computers. There are also online wallets. Users may think, ‘the wallet is where I store my bitcoins’. More accurately, the wallet stores the digital credentials you need to access the bitcoin values registered in the blockchain ledger against your anonymised identity.

Launching her wallet, Alice enters the amount she wants to send, plus the Bitcoin address provided by Bob, and presses Send.

For security, Alice’s wallet uses public–private key cryptography to append a scrambled digital signature to the resulting message. By keeping her private key secret, Alice is guaranteed that no-one can spoof Bitcoin into thinking that the message was sent to the system by anyone other than her. The Bitcoin messaging system records neither Alice’s nor Bob’s identity in the data record, other than in deeply encrypted form: an aspect of Bitcoin that has been criticised for its ability to mask criminally-inspired transactions.

At this stage, Alice is initiating no more than a proposal, namely that the Bitcoin blockchain should be altered to show her wallet as that bit ‘emptier’, and Bob’s a bit ‘fuller’. Implementing computers on the network will check to see whether Alice’s digital signature can be verified with her public key, that the address provided by Bob is valid, and that Alice’s account does in fact have enough bitcoin value to support the transaction.

If Alice’s bitcoin transaction proposal is found to be valid and respectable, the transaction can be enacted, by modifying the blockchain database (updating the ledger, if you like). As Marc pointed out, this is done not by changing what is there already, but by adding a new block to the end of the chain. Multiple transactions get bundled together into one Bitcoin block, and the process is dynamically managed by the Bitcoin server network to permit the generation of just one new such block approximately every ten minutes – for peculiar reasons I shall later explain.

Making a block: the role of the ‘hash’

The blocks are generated by special participating servers in the Bitcoin network, which are called ‘miners’ because they get automatically rewarded for the work they do by having some new Bitcoin value allocated to them.

In the process of making a block to add to the Bitcoin blockchain, the first step is to gather up the pending transaction records, which are placed into the body of the new block. These transaction records themselves are not encrypted, though the identities of senders and receivers are. I have heard people say that the whole blockchain is irreversibly encrypted, but if you think about it for a second, this has to be nonsense. If the records were rendered uninspectable, the blockchain would be useless as a record-keeping system!

However, the block as a whole, and beyond that the blockchain, has to be protected from accidental or malicious alteration. To do this, the transaction data is put through a process called ‘cryptographic hashing’. Hashing is a well-established computing process that feeds an arbitrarily large amount of data (the ‘input’ or ‘message’) through a precisely defined algorithmic process, which reduces it down to a fixed-length string of digits (the ‘hash’). The hashing algorithm used by Bitcoin is SHA-256, created by the US National Security Agency and put into the public domain.

By way of example, I used the facility at http://passwordsgenerator.net/sha256-hash-generator to make an SHA-256 hash of everything in this article up to the end of the last paragraph (in previous edits, I should add; I’ve made changes since). I got 9F0B 653D 4E6E 7323 4E03 B04C F246 4517 8A96 DFF1 7AA1 DA1B F146 6E1D 27B0 CA75 (you can ignore the spaces).

The hash string looks kind of random, but it isn’t – it’s ‘deterministic’. Applying the same hashing algorithm to the same data input will always result in the same hash output. But, if the input data were to be modified by even a single character or byte, the resulting hash would come out markedly different.
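Both properties – determinism and extreme sensitivity to input changes – are easy to demonstrate with Python’s standard `hashlib` module (a quick sketch; the example strings are my own, not taken from any Bitcoin software):

```python
import hashlib

def sha256_hex(text: str) -> str:
    """Return the SHA-256 hash of a UTF-8 string as a 64-character hex digest."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# Deterministic: the same input always yields exactly the same hash.
print(sha256_hex("Alice pays Bob 1.23 BTC"))
print(sha256_hex("Alice pays Bob 1.23 BTC"))  # identical to the line above

# A one-character change produces a completely different hash.
print(sha256_hex("Alice pays Bob 1.24 BTC"))
```

Run it and you will see the first two digests match exactly, while the third bears no visible resemblance to them.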

Note that the hash function is, for all practical purposes, ‘one-way’. That is, going from data to hash is easy, but processing the hash back into the data is impossible: in the case of the example I just provided, so much data has been discarded in the hashing process that no-one receiving just the hash can ever reconstitute the data. It is also theoretically possible, because of the data-winnowing process, that another set of data subjected to the same hashing algorithm could output the same hash, but this is an extremely unlikely occurrence. In the language of Bitcoin, the hashing process is described as ‘collision-resistant’.

The sole purpose of this hashing process is to build a kind of internal certificate, which gets written into a special part of the block called the ‘header’. Here, cryptography is not being used to hide the transaction data, as it might in secret messaging, but to provide a guarantee that the data has not been tampered with.

Joining the hash of the transaction data in the header are some other data, including the current timestamp, and a hash of the header of the preceding block in the chain. These additions are what gives the blockchain its inherent history, for the preceding block also contained a hash of the header of the block before that, and so on down the line to the very first block ever made.
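The header-chaining idea just described can be sketched as a toy ledger in Python (a deliberately simplified model: real Bitcoin headers contain more fields, such as the difficulty target and nonce, and are hashed twice with SHA-256):

```python
import hashlib
import json
import time

def make_block(transactions, prev_header_hash):
    """Build a block whose header holds a hash of its data and of the previous header."""
    header = {
        "timestamp": time.time(),
        "data_hash": hashlib.sha256(json.dumps(transactions).encode()).hexdigest(),
        "prev_header_hash": prev_header_hash,
    }
    header_hash = hashlib.sha256(json.dumps(header, sort_keys=True).encode()).hexdigest()
    return {"header": header, "header_hash": header_hash, "transactions": transactions}

def chain_is_valid(chain):
    """Recompute every hash; an alteration anywhere back down the chain breaks it."""
    for i, block in enumerate(chain):
        data_hash = hashlib.sha256(json.dumps(block["transactions"]).encode()).hexdigest()
        if data_hash != block["header"]["data_hash"]:
            return False
        if i > 0 and block["header"]["prev_header_hash"] != chain[i - 1]["header_hash"]:
            return False
    return True

genesis = make_block(["genesis"], prev_header_hash="0" * 64)
block1 = make_block(["Alice -> Bob: 1.23"], genesis["header_hash"])
chain = [genesis, block1]
print(chain_is_valid(chain))   # True

# Tampering with an early transaction invalidates the whole chain.
genesis["transactions"][0] = "forged"
print(chain_is_valid(chain))   # False
```

The point of the sketch is the last two lines: because every header embeds a hash of the previous one, retroactively editing any block is detectable by anyone who re-runs the hashes.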

The role of the ‘miner’ in the Bitcoin system

Now, as far as I can tell, there is nothing in principle wrong with having the blockchain-building process run by one trusted computer, with the refreshed blockchain perhaps being broadcast out at intervals and stored redundantly on several servers as a protection against disaster.

But that’s not the way that Bitcoin chose to do things. They wanted the block-writing process to be done in a radically decentralised way, by servers competing against each other on a peer-to-peer network; they also chose to force these competing servers to solve tough puzzles that are computationally very expensive to process. Why?

Because intimately entangled in the way the Bitcoin ecology builds blocks is the way that new bitcoins are minted; at present the ‘reward’ from the system to a miner-machine for successfully solving the puzzle and making the latest block in the chain is a fee of 12.5 fresh new bitcoins, worth thousands of dollars at current exchange rates. That’s what motivates private companies to invest in mining hardware, and take part in the game.

This reward-for-work scheme is why the specialised computers that participate in the block-building competition are called ‘miners’.

Let’s assume that the miner has got as far through the process as verifying and bundling the transaction data, and has created the hash of the data for the header. At this point the Bitcoin system cooks up a mathematical puzzle based on the hash, which the ‘miner’ system making the block has to solve. These mathematical puzzles (and I cannot enlighten you more about their precise nature, it’s beyond me!) can be solved only by trial and error methods. Across the network, the competing miner servers are grinding away, trying trillions of possible answers, hashing the answers and comparing them to the header hash and the puzzle instructions to see if they’ve got a match.
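A toy version of that trial-and-error search can be written in a few lines of Python. (This is only an illustration of the principle: real Bitcoin mining repeatedly double-hashes the block header with an incrementing nonce, looking for a hash below a network-set difficulty target; here I just look for a few leading zero digits.)

```python
import hashlib

def mine(header_hash: str, difficulty: int = 4):
    """Try nonces until sha256(header_hash + nonce) starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        attempt = hashlib.sha256(f"{header_hash}{nonce}".encode()).hexdigest()
        if attempt.startswith(target):
            return nonce, attempt
        nonce += 1

def check(header_hash: str, nonce: int, difficulty: int = 4) -> bool:
    """Verifying a claimed solution takes a single hash - far cheaper than finding one."""
    attempt = hashlib.sha256(f"{header_hash}{nonce}".encode()).hexdigest()
    return attempt.startswith("0" * difficulty)

nonce, winning_hash = mine("9f0b653d4e6e")   # any header hash will do
print(nonce, winning_hash)
print(check("9f0b653d4e6e", nonce))          # True: checking is easy
```

Each extra zero digit of difficulty multiplies the expected number of attempts by sixteen, which is essentially how the real network keeps making the puzzle harder as miners get faster.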

This consumes a lot of computing power and energy – in 2014, one bitcoin ‘mining farm’ operator, Megabigpower in Washington state, USA, estimated that it was costing 240 kilowatt-hours of electricity per bitcoin earned, the equivalent of 16 gallons of petrol. It’s doubtless gone up by now. The hashing power of the machines in the Bitcoin network has surpassed the combined might of the world’s 500 fastest supercomputers! (See ‘What is the Carbon Footprint of a Bitcoin?’ by Danny Bradbury: https://www.coindesk.com/carbon-footprint-bitcoin).

When a miner ‘thinks’ it has a correct solution, it broadcasts to the rest of the network and asks other servers to check the result (and, thanks to the hash-function check, though solving the problem is hard, checking the result is easy). All the servers that ‘approve’ the solution – the winning answer is, strangely, called a ‘nonce’ – will accept the proposed block, now timestamped and with a hash of the previous block’s header included to form the chain-link, and they update their local record of the blockchain accordingly. The successful miner is rewarded with a transaction that earns it a Block Reward, and I think collects some user transaction fees as well.

Because Bitcoin is decentralised, there’s always the possibility that servers will fall out of step, which can cause temporary forks and mismatches at the most recent end of the blockchain, across the network (‘loose ends’, you might call them). However, the way that each block links to the previous one, plus the timestamping, plus the rule that each node in the network must work with the longest extant version it can find, means that these discrepancies are self-repairing, and the data store is harmonised automatically even though there is no central enforcing agency.

The Bitcoin puzzle-allocation system dynamically adjusts the complexity of the puzzles so that they are being solved globally at a rate of only about six an hour. Thus, although there is a kind of ‘arms race’ between competing miners, running on ever faster platforms, the puzzles just keep getting tougher to crack, and this is what controls the slow increase in the Bitcoin ‘money supply’. Added to this is a process by which the rate of reward for proof-of-work is slowly decreased over time, which in theory should make bitcoins increasingly valuable, rewarding the people who own and use them.

As I shall shortly explain, this computationally expensive ‘proof-of-work’ system is not a necessary feature of blockchain per se, and other blockchains use a less expensive ‘proof-of-stake’ system to allocate work.

Disentangling blockchain from Bitcoin

To sum up, in my opinion the essential characteristics of blockchain in general, rather than Bitcoin in particular, are as follows (and compare this with the Wikipedia extract quoted earlier):

  • A blockchain is a data structure that acts as a consultable ledger for recording sequences of facts, statuses, actions or transactions that occur over time. So it is not a database in the sense that a library catalogue is; still less could it be the contents of that library; but the lending records of that library could well be in blockchain form, because they are transactions over time.
  • New data, such as changes of status of persons or objects, are added by appending blocks of newly formed data; each block ‘points’ to the previous one, and each block also gets a timestamp, so that together the blocks constitute a chain from oldest to newest.
  • The valuable data in the blocks are not necessarily encrypted (contrary to what some people say), so that with the right software, the record is open to inspection.
  • However, a fairly strong form of cryptographic hashing is applied to the data in each block, to generate a kind of internal digital certificate, which acts as a guarantee that the data has not become corrupted or maliciously altered. The hash string thus generated is recorded in the head of the block; and the whole head of the block will be hashed and embedded in the head of the following block, meaning that any alteration to a block can be detected.

And I believe we can set aside the following features which are peculiarities of Bitcoin:

  • The Bitcoin blockchain is a record of all the transactions that have ever taken place between all of the actors within the Bitcoin universe, which is why it is so giganormous (to coin a word). Blockchains that do not have to record value exchange transactions can be much smaller and non-global in scope – my personal medical record, for example, would need to journal only the experiences of one person.
  • All the data tracked by the Bitcoin blockchain has to live inside the blockchain; but blockchain systems can also be hybridised by having them store secure and verified links to other data repositories. And that’s a sensible design choice where the entire data bundle contains binary large objects (BLOBs) such as x-rays, scans of land title deeds, audio and video recordings, etc.
  • The wasteful and computationally expensive ‘proof of work’ test faced by Bitcoin miners is, to my mind, totally unnecessary outside of that kind of cryptocurrency system, and is a burden on the planet.

Marc shows a block

In closing his presentation, Marc displayed a slide image of the beginning of the record of block number 341669 in the Bitcoin blockchain, from back in February 2015 when the ‘block reward’ for finding a ‘nonce’ was 25 bitcoins. You can follow this link to examine the whole block on blockchain.info: https://blockchain.info/block/0000000000000000062e8d7d9b7083ea45346d7f8c091164c313eeda2ce5db11. The PDF version of this article (see below) contains some screen captures of this online record.

That block carries records of 1,031 transactions, with a total value of 1,084 BTC, and it is about 377 kB in size (and remember, these blocks add up!). The transaction record data can be clearly read, even though it will not make much sense to human eyes because of the anonymisation provided by the encrypted user address of the sender and the encrypted destination address of the receiver. Thus all we can see is that ‘17p3BWzFeqh7DLELpodxt2crQjisvDbC95’ sent 50 BTC to ‘1HEhEpnDhRMUEQSxSWeV3xBoxdSHjfMZJ5’.

Other cryptocurrencies, other blockchain methods

Bitcoin has had quite a few imitators; a July 2017 article by Joon Ian Wong listed nine other cryptocurrencies – Ethereum, Ethereum Classic, Ripple, Litecoin, Dash, NEM, IOTA, Monero and EOS. (Others not mentioned include Namecoin, Primecoin, Nxt, BlackCoin and Peercoin.) That article also points to how unstable the exchange values of cryptocurrencies can be: in a seven-day period in July, several lost over 30% of their dollar values, and $7 billion of their market value was wiped out!

From our point of view, what’s interesting is a couple of variations in how alternative systems are organised. Several of these systems have ditched the ‘proof-of-work’ competition as a way of winning the right to make the next block, in favour of some variant of what’s called ‘proof-of-stake’.

As an example, consider Nxt, founded in late 2013 with a crowd-sourced donation campaign. A fixed ‘money’ supply of a billion NXT coins was then distributed, in proportion initially to the contributions made; from this point, trading began. Within the Nxt network, the right to ‘forge’ the next block in the transaction record chain is allocated partly on the basis of the amount of the currency a prospective ‘forger’ holds (that’s the Stake element), but also on the basis of a randomising process. Thus the task is allocated to a single machine, rather than being competed for; and without the puzzle-solving element, the amount of compute power and energy required is slight – the forging process can even run on a smartphone! As for the rewards for ‘playing the game’ and forging the block, the successful block-forger gains the transaction fees.
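The stake-plus-randomness selection just described can be sketched as a weighted random draw. This is a deliberate simplification, not Nxt’s actual forging algorithm (which derives its randomness deterministically from account balances and previous block signatures); the account names and balances are invented:

```python
import random

# Illustrative stakes only: holders of more NXT get proportionally more
# chances to forge the next block - no puzzle-solving race is involved.
stakes = {"alice": 600_000, "bob": 300_000, "carol": 100_000}

def pick_forger(stakes: dict, rng: random.Random) -> str:
    """Choose one forger, with probability proportional to stake held."""
    holders = list(stakes)
    weights = [stakes[h] for h in holders]
    return rng.choices(holders, weights=weights, k=1)[0]

rng = random.Random(42)  # fixed seed so the sketch is reproducible
picks = [pick_forger(stakes, rng) for _ in range(10_000)]
ratio = picks.count("alice") / len(picks)
# Over many rounds, alice (60% of the stake) forges roughly 60% of blocks.
assert 0.55 < ratio < 0.65
```

A single draw replaces the energy-hungry mining competition, which is why proof-of-stake forging can run on modest hardware.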

Marc specifically mentioned Ethereum, founded in 2014–15, the currency of which is called the ‘ether’. In particular he referred to how Ethereum supports ‘Smart Contracts’, which are exchange mechanisms performed by instructions in a scripting language being executed on the Ethereum Virtual Machine – not literally a machine, but a distributed computing platform that runs across the network of participating servers. Smart contracts have been explored by the bank UBS as a way of making automated payments to holders of ‘smart bonds’, and a project called The DAO tried to use the Ethereum platform to crowd-fund venture capital. The scripts can execute conditionally – the Lighthouse project is a crowd-funding service that makes transfers from funders to projects only if the funding campaign target has been met.
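The conditional execution of such a crowd-funding contract can be mirrored in a few lines. Real Ethereum contracts are written in languages such as Solidity and executed by every node of the Ethereum Virtual Machine; this Python sketch (with invented pledges and amounts) only captures the Lighthouse-style all-or-nothing logic:

```python
def settle_campaign(pledges: dict, target: int) -> dict:
    """Release funds to the project only if the campaign target was met;
    otherwise refund every funder in full."""
    raised = sum(pledges.values())
    if raised >= target:
        return {"project": raised}   # condition met: pledges transfer
    return dict(pledges)             # condition not met: full refunds

# Target met: both pledges go to the project.
assert settle_campaign({"a": 60, "b": 50}, target=100) == {"project": 110}
# Target missed: each funder gets their pledge back.
assert settle_campaign({"a": 30, "b": 20}, target=100) == {"a": 30, "b": 20}
```

The significant difference on a blockchain is that this rule is enforced by the network itself, not by a trusted intermediary holding the money.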

Other uses of blockchain distributed ledgers

In October 2015, a feature article in The Economist pointed out that ‘the technology behind bitcoin lets people who do not know or trust each other build a dependable ledger. This has implications far beyond the cryptocurrency.’ One of the areas of application they highlighted was the secure registration of land rights and real-estate transactions, and a pioneer in this has been Lantmäteriet, Sweden’s Land Registry organisation.

Establishing a blockchain-based publicly inspectable record of the ownership (and transfer of ownership) of physical properties poses rather different problems from those faced by systems that simply transfer currency. The base records can include scans of signed contracts, digital photos, maps and similar objects. What Lantmäteriet aims to collect in the blockchain are what it dubs ‘fingerprints’ for these digital assets – SHA-256 hashes computed from the digital data. You cannot tell from a fingerprint what a person looks like, but it can still function as a form of identity verification. As a report on the project explains:

‘A purchasing contract for a real estate transaction that is scanned and becomes digital is an example. The hash that is created from the document is unique. For example, if a bank receives a purchasing contract sent via email, the bank can see that the document is correct. The bank takes the document and run the algorithm SHA-256 on the file. The bank can then compare the hash with the hash that is on the list of verification records, assuming that it is available to the bank. The bank can then trust that the document really is the original purchasing contract. If someone sends an incorrect contract, the hash will not match. Despite the fact that email has a low level of security, the bank can feel confident about the authenticity of the document.’ (‘The Land Registry in the blockchain’ — http://ica-it.org/pdf/Blockchain_Landregistry_Report.pdf)
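The bank’s verification step described in that quotation amounts to a few lines of code. A hedged sketch, with an invented contract text standing in for the scanned document:

```python
import hashlib

def fingerprint(document: bytes) -> str:
    """Compute the SHA-256 'fingerprint' of a digital asset."""
    return hashlib.sha256(document).hexdigest()

contract = b"Purchasing contract: Lot 7, Main Street ..."
recorded = fingerprint(contract)   # the hash held in the verification record

genuine = contract                                    # copy arriving by email
tampered = contract.replace(b"Lot 7", b"Lot 8")       # subtly altered copy

assert fingerprint(genuine) == recorded    # original document: hashes match
assert fingerprint(tampered) != recorded   # any alteration: mismatch detected
```

Even though email itself is insecure, the comparison against the on-chain fingerprint is what lets the bank trust the document.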

In the UK, Her Majesty’s Land Registry has started a project called ‘Digital Street’ to investigate using blockchain to allow changes of property ownership to be completed almost instantaneously. Greece, Georgia and Honduras have similar projects under way.

In Ghana, there is no reliable nationwide way of registering ownership of land and property, but a nonprofit project called Bitland is drawing up plans for a blockchain-verified process for land surveys, agreements and documentation, which – independent of government – will provide people with secure title (www.bitland.world). As they point out, inability to prove ownership of land is quite common across Africa, and this means that farmers cannot raise bank capital for development by putting up land as security.

Neocapita is a company that is developing Stoneblock as a decentralised blockchain-based registration service for any government-managed information, such as citizen records. They are working in collaboration with the United Nations Development Program, World Vision, and two governments (Afghanistan and Papua New Guinea), initially around providing a transparent record of aid contributions, and land registry.

Noeleen Schenk on blockchain and information governance

After Marc Stephenson had given his technical overview of Blockchain, Noeleen Schenk (also of Metataxis) addressed the issue of what these developments may mean for people who work with information and records management, especially where there are issues around governance.

Obviously there is great interest in blockchain in financial markets, securities and the like, but opportunities are also being spotted around securing the integrity of the supply chain and proving provenance. Walmart is working with IBM on a project that would reliably track foodstuffs, from source to shelf. The Bank of Canada is looking at using blockchain methods to pass on verified customer identities, on the basis that the bank has already carried out identity checks when you opened your account. Someone in the audience pointed out that there are also lots of applications for verified records of identity in the developing world, and Noeleen mentioned that Microsoft and the UN are looking at methods to assist the approximately 150 million people who lack proof of identity.

Google DeepMind Health is looking at using some blockchain-related methods around electronic health records, in a concept called ‘Verifiable Data Audit’, which would automatically record every interaction with patient data (changes, but also access). They argue that health data needn’t be as radically decentralised as in Bitcoin’s system – a federated structure would suffice – nor is proof-of-work an appropriate part of the blockmaking process in this context. The aim is to secure trust in the data record (though ironically, DeepMind was recently deemed to have handled 1.6 million Royal Free Hospital patient records inappropriately).

Noeleen referred to the ISO standard on records management, ISO 15489-1, which defines the characteristics of ‘authoritative records’: they must meet standards of authenticity, reliability, integrity and usability. What has blockchain to offer here?

Well, where a blockchain is managed on a decentralised processing network, one advantage can be distributed processing power, and avoidance of the ‘single point of failure’ problem. The use of cryptographic hashes ensures that the data has not been tampered with, and where encryption is used, it helps secure data against unauthorised access in the first place.
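The tamper-evidence that hashing lends to a chain of blocks can be shown in miniature. A minimal sketch, not any production ledger; the record contents are invented, and real blockchains add timestamps, signatures and consensus on top:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's canonical JSON serialisation."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def build_chain(records: list) -> list:
    """Link blocks by storing each predecessor's hash in its successor."""
    chain, prev = [], "0" * 64   # conventional all-zero 'genesis' predecessor
    for rec in records:
        block = {"prev": prev, "data": rec}
        prev = block_hash(block)
        chain.append(block)
    return chain

def verify_chain(chain: list) -> bool:
    """Recompute every hash; any break in the links means tampering."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = build_chain(["deed A->B", "deed B->C", "deed C->D"])
assert verify_chain(chain)
chain[1]["data"] = "deed B->X"   # tamper with the middle block...
assert not verify_chain(chain)   # ...and the chain no longer verifies
```

Because each block’s hash depends on its predecessor’s, rewriting any one record silently invalidates every block after it, which is what makes the ledger hard to subvert.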

Challenges to be solved

Looking critically at blockchain with an information manager’s eye, Noeleen noticed quite a few challenges, of which I highlight some:

  • Private blockchains are beginning to make their appearance in various sectors (the Walmart provenance application is a case in point). This raises questions of what happens when different information management systems need to interoperate.
  • In many information management applications, it is neither necessary nor desirable to have all of the information actually contained within the block (the Lantmäteriet system is a case in point). Bringing blockchain into the picture doesn’t make the problem of inter-relating datasets go away.
  • Blockchain technology will impact the processes by which information is handled, and people’s roles and responsibilities within those processes. Centres of control may give way to radical decentralisation.
  • There will be legal and regulatory implications, especially where information management systems cross different jurisdictions.
  • Noeleen has noticed that where people gather (with great enthusiasm) to discuss what blockchain can do, there seems to be very poor awareness amongst them of well-established record-keeping theory, principles, and normal standards of practice. The techies are not thinking enough about information management requirements.

These issues require information professionals to engage with the IT folks, and advocate the incorporation of information and record-keeping principles into blockchain projects, and the application of information architectural rigour.

Intermediate discussion

Following Noeleen’s presentation, there were some points raised by the audience. One question was how, where the blockchain points to data held externally, that external data can itself be verified, and how it can be secured against inappropriate access.

Someone made the point that it is possible to set up a ‘cryptographic storage system’ in which the data is itself encrypted on the data server, using well established public–private key encryption methods, and therefore accessible only to those who have access to the appropriate key. As for the record in the blockchain, what that stores could be the data location, plus the cryptographic hash of the data, so that any tampering with the external data would be easy to detect.

What blockchain technology doesn’t protect against is bad data quality to start with. I’m reminded of a recent case in which it emerged that a sloppy clinical coder had entered a code on a lady’s record, indicating that she had died of Sudden Infant Death Syndrome (happily, she was very much alive). In a blockchain-based record, that transaction could never be erased – but that doesn’t prevent the record being corrected afterwards.

John Sheridan: Blockchain and the Archive: the TNA experience

Our third presentation was from John Sheridan, the Digital Director at The National Archives (TNA), with the title ‘Application of Distributed Ledger Technology’. He promised to explain what kinds of issues the Archive worries about, and where they think blockchains (or distributed ledgers more generally) might help. On the digital side of TNA, they are now looking at three use-cases, which he would describe.

John remarked that the State gathers information ‘in order to make Society legible to it’ – so that it might govern. Perhaps The Domesday Book was one of the world’s first structured datasets, collected so that the Norman rulers might know who owned what across the nation, for taxation purposes. The Archive’s role, on the other hand, is to enable the citizen to see the State, and what the State has recorded, by perusing the record of government (subject to delays).

Much of the ethos of the TNA was set by Sir Hilary Jenkinson, of the Public Record Office (which merged with three other bodies to form TNA in 2003). He was a great contributor to archive theory, and in 1922 wrote A Manual of Archive Administration (text available in various formats from The Internet Archive, https://archive.org/details/manualofarchivea00jenkuoft). TNA still follows his attitude and ideas about how information is appraised and selected, how it is preserved, and what it means to make that information available.

An important part of TNA practice is the Archive Descriptive Inventory – a hierarchical organisation of descriptions for records, in which is captured something of the provenance of the information. ‘It’s sort of magnificent… it kind of works,’ he said, comparing it to a steam locomotive. But it’s not the best solution for the 21st century. It’s therefore rather paradoxical that TNA has been running a functional digital archive with a mindset that is ‘paper all the way down’ – a straight line of inheritance from Jenkinson, using computers to simulate a paper record.

Towards a second-generation digital archive

It’s time, he said, to move to a second-generation approach to digital archive management; and research into disruptive new technologies is important in this.

For the physical archive, TNA practice has been more or less to keep everything that is passed to it. That stuff is already in a form that they can preserve (in a box), and that they can present (only eyes required, and maybe reading spectacles). But for the digital archive, they have to make decisions against a much more complex risk landscape; and with each generation of technological change, there is a change in the digital preservation risks. TNA is having to become far more active in making decisions about what evidence the future may want to have access to, which risks they will seek to mitigate, and which ones they won’t.

They have decided that one of the most important things TNA must do, is to provide evidence for purposes of trust – not only in the collection they end up with, but also in the choices that they have made in respect of that collection. Blockchain offers part of that solution, because it can ‘timestamp’ a hash of the digital archive asset (even if they can’t yet show it to the public), and thereby offer the public an assurance, when the archive data is finally released, that it hasn’t been altered in the meantime.

Some other aims TNA has in respect of the digital archive include: being more fluid about how an asset’s context is described; dealing with uncertainties in provenance, such as about when a record was created; and permitting a more sophisticated, perhaps graduated, form of public access, rather than just now-you-can’t-see-it, now-you-can. (They can’t simply dump everything on the Web – there are considerations of privacy, of the law of defamation, of intellectual property and more besides.)

The Archangel project

Archangel is a brand new project in which TNA is engaged together with the University of Surrey’s Centre for the Digital Economy and the Open Data Institute. It is one of seven projects that EPSRC is funding to look at different contexts of use for distributed ledger technology. Archangel is focused specifically on public digital archives, and the participants will try to work with a group of other memory institutions.

The Archangel project will not be using the blockchain methods that Marc had outlined. Apparently, they have their own distributed ledger technology (DLT), with ‘permissioned’ access.

The first use-case, which will occupy them for the first six months, will focus on a wide variety of types of research data held by universities: they want to see if they can produce sets of hashes for such data, such that at a later date, when the findings of the research are published and the data is potentially archived, any question of whether the data has been tampered with or manipulated can be dealt with by cryptographic assurance spread across a group of participating institutions. (The so-called ‘Climategate’ furore comes to mind.)

The second use-case is for a more complex kind of digital object. For example, TNA preserves the video record of proceedings of The Supreme Court. In raw form, one such digital video file could weigh in at over a terabyte! Digital video transcoding methods, including compression algorithms, are changing at a rapid pace, so that in a decade’s time it’s likely that the digital object provided to the public will have to have been converted to a different file format. How is it possible to create a cryptographic hash for something so large? And is there some way of hashing not the bit sequence, but the informational content in the video?
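On the first of those questions, at least, sheer size is not the obstacle: SHA-256 can be fed a file incrementally, in fixed-size chunks, without ever holding it all in memory. A sketch, using an in-memory stream as a stand-in for a huge video file; note this fingerprints the bit sequence only, so the harder question of hashing the informational content across format conversions remains open:

```python
import hashlib
import io

def hash_stream(stream, chunk_size: int = 1 << 20) -> str:
    """SHA-256 a stream of any size by updating the hash chunk by chunk."""
    h = hashlib.sha256()
    while chunk := stream.read(chunk_size):
        h.update(chunk)
    return h.hexdigest()

# Stand-in for a multi-terabyte video: a few megabytes of repeated bytes.
data = b"frame-data " * 500_000
assert hash_stream(io.BytesIO(data)) == hashlib.sha256(data).hexdigest()
```

The same chunked pattern works on an open file object, so even a terabyte file needs only a single sequential pass and a few kilobytes of state.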

It’s also fascinating to speculate about how machines in future might be able to interpret the informational content in a video. At the moment, a machine can’t interpret the meaning in someone’s facial expressions – but maybe in the future?

For this, they’ll be working with academics who specialise in digital signal processing. They are also starting to face similar questions with ‘digital surrogates’ – digital representations of an analogue object.

The third use-case is about Deep Time. Most people experimenting with blockchain have a relatively short timescale over which a record needs to be kept in verifiable form, but the aspirations of a national archive must look to hundreds, maybe thousands of years.

Another important aspect of the Archangel project is the collaboration that is being sought between memory institutions, which might reach out to each other in a concerted effort to underscore trust in each other’s collections. On a world scale this is important because there are archives and collections at significant risk – in some places, for example, people will turn up with Kalashnikovs to destroy evidence of human rights abuses.

 

Discussions and some closing thoughts

Table group discussions: NetIKX meetings typically feature a ‘second half’, which is made up of table-group discussions or exercises (syndicate sessions), followed by a summing-up plenary discussion. However, the speakers had not organised any focused discussion topics, and certainly the group I was in had a fairly rambling discussion trying to get to grips with the complexity and novelty of the subject. Likewise, there was not much ‘meat’ that emerged in the ten minutes or so of summing up.

One suggestion from Rob Begley, who is doing some research into blockchain, was that we might benefit from reading Dave Birch’s thoughts on the topic – see his Web site at http://www.dgwbirch.com. However, it’s to be borne in mind that Birch comes at the topic from a background in electronic payments and transactions.

My own closing thoughts: There is a lot of excitement – one might say hype – around blockchain. As Noeleen put it, in the various events on blockchain she had attended, a common attitude seems to be ‘The answer is blockchain! Now, what was the problem?’ As she also wisely observed, the major focus seems to be on technology and cryptocurrency, and the principles of information and records management scarcely get a look-in.

The value of blockchain methods seems to centre chiefly on questions of trust, using cryptographic hashing and a decentralised ledger system to create a hard-to-subvert time-stamped record of transactions between people. The transactional data could be about money (and there are those who suggest it is the way forward for extending banking services in the developing world); the application to land and property registration is also very promising.

Another possible application I’m interested in could be around ‘time banking’, a variant of alternative currency. For example in Japan, there is a scheme called ‘Fureai Kippu’ (the ‘caring relationship ticket’), which was founded in 1995 by the Sawayaka Welfare Foundation as a trading scheme in which the basic unit of account is an hour of service to an elderly person who needs help. Sometimes seniors help each other and earn credits that way, sometimes younger people work for credits and transfer them to elderly relatives who live elsewhere, and some people accumulate the credits themselves against a time in later life when they will need help. It strikes me that time-banking might be an interesting and useful application of blockchain – though Fureai Kippu seems to get on fine without it.

When it comes to information-management applications that are non-transactional, and which involve large volumes of data, a blockchain system itself cannot cope: the record would soon become impossibly huge. External data stores will be needed, to which a blockchain record must ‘point’. The hybrid direction being taken by Sweden’s Lantmäteriet, and by the Archangel project, seems more promising.

As for the event’s title, ‘The implications of Blockchain for KM and IM’ – my impression is that blockchain offers nothing to the craft of knowledge management, other than perhaps to curate information gathered in the process.

Some reading suggestions

Four industries blockchain will disrupt (https://www.researchandmarkets.com/blog/4-industries-blockchain-will-disrupt)

Two billion people lack access to a bank account. Here are 3 ways blockchain can help them (https://www.weforum.org/agenda/2017/06/3-ways-blockchain-can-accelerate-financial-inclusion)

TED talk: Don Tapscott on ‘How the blockchain is changing money and business’ (https://www.ted.com/talks/don_tapscott_how_the_blockchain_is_changing_money_and_business)

Why Ethereum holds so much promise (http://uk.businessinsider.com/bitcoin-ethereum-price-2017-7)

Wikipedia also has many valuable articles about blockchain, cryptographic hashing, etc.

Note:

The original version of this article can be found at http://www.conradiator.com/kidmm/netikx-jul2017-blockchain.html. You can also download a pdf (9 pages; 569 kB): http://www.conradiator.com/kidmm/netikx-resource/NetIKX-blockchain.pdf.


Developing Effective Collaborative Knowledge Spaces

Conrad Taylor writes:

During 2017, which is a tenth anniversary year for NetIKX, a number of eminent speakers have been invited to lead meetings, speakers who for the most part have addressed NetIKX before. At the meeting on 18 May 2017 the speakers were Paul Corney and Victoria Ward.

Paul worked for 25 years in top management in the City of London financial sector (Saudi International Bank and Zurich Reinsurance), and for the last couple of decades has pursued a ‘portfolio career’ as a business adviser, facilitator and business coach, with clients in 24 different countries, including Iran, Saudi Arabia, the Gulf States and several African countries.

Paul is also a managing partner at the Sparknow consultancy, which Victoria Ward founded in 1997. Victoria’s background is similarly in knowledge management in the banking sector. Sparknow approaches organisational KM using narrative enquiry methods, and Victoria can list amongst her former clients a number of banks, government agencies, museums and cultural organisations, the World Health Organisation and the British Council.

Recently, Victoria and Paul have been working with Clive Holtham of the Cass Business School on a project looking at how the arrangement of space impacts the working environment, and knowledge sharing within that. Paul has been conducting a kind of rolling survey across various locations around the world. We in NetIKX would be the latest to add our thoughts; and Paul intends to publish a report as the summation of this enquiry.

Points of view

Paul and Victoria set up an exercise in which the forty or so people present were clustered into three groups, out of earshot of each other. Each group was then quietly told what ‘profession’ we were to adopt as our collective point of view. We were to carefully make an assessment of the room we were in, from that assumed profession’s point of view, and list the positive, and difficult, characteristics of the room. Then each group, through a spokesperson, would tell the others about their list of good or bad room features – and the other groups were supposed to guess that group’s profession!

Group One commented that the room was very white and light; that there were lots of power points. They noted there was quite a lot of furniture, but the tables were on wheels and easily moved; there were lots of nooks and crannies, and lots of potential for mess around the coffee machines. We guessed that they were cleaners! Group Two mentioned the functional design of the room; the low ceiling and narrow form of the room; and lots of natural light from the windows. They were interior designers!

I was in Group Three and I think we had the most fun assignment. We talked about there being a couple of useful exits including a fire exit onto the roof (with presumably a way off that down to street level); various valuables conveniently next to the door, and some rather nice looking IT equipment; perhaps too many windows to be able to operate unseen, but no CCTV cameras. Yes, we were the thieves!

That was a nice, fun ice-breaker, but it was also more than that, as Victoria and Paul explained. Things (and not just rooms!) look different according to your point of view. They had come across this exercise used in a very large gathering at the Smithsonian Museum, and it’s especially useful to deploy at the start of a meeting when you want to draw attention to how a thing, or a situation, might look very different from somebody else’s perspective; something that’s good to bear in mind when there are many stakeholders.

Perspectives on Knowledge Management

KM, or Knowledge Management, has been described as ‘a discipline focused on ways that organisations create and use knowledge’. However, said Paul, beyond that there is no single accepted definition of what KM is, and it’s a field with no agreed global standards as yet.

Paul works around the use of knowledge within businesses. In his newly published book ‘Navigating the Minefield: a practical KM Companion’ he has suggested some characteristics which could define ‘a good KM programme’, such as it being in support of a business’s goals, and aligned with its culture. One focus will be operational, seeking to cut the costs of doing business (in money or time) – in practice, this is the focus of four out of five KM projects in business. Some projects look in more strategic directions, towards innovation and future business benefit.

One paradox of KM is that many of the people who practise it do not stay long term with their employers, but move on every few years to a new appointment. This can lead to the pursuit of short-term goals and ‘fighting fires’ rather than more strategic approaches.

How can you effectively transfer knowledge from an expert, to a wider community? One positive story Paul shared was of work he did with Cláudia Bandeira de Lima, a leading authority on childhood autism and language development in the Portuguese-speaking world. The solution they devised was to run a foundation programme in the methodology, PIPA (Programa Integrado Para o Autismo), teaching courses and accrediting practitioners.

To represent another aspect of KM, at the personal level, Paul used an image of a laptop. If it is stolen or breaks down, you can replace the hardware and the applications, but if you haven’t backed up the documents which constitute your knowledge resources, ‘you’re toast!’ In doing knowledge audits, he and Victoria often found that sloppy attitudes to managing digital knowledge resources were rampant. An American survey from a few years ago estimated that a typical cost to replace someone in a senior business position is in the region of $400,000 – because when the previous incumbent moved on, they took their knowledge with them, and nobody had done anything to ‘back it up’.

Drivers and definitions

What is driving this thing called ‘knowledge management’? Why do people do it? To Paul it seems that a major driver within many businesses is compliance with regulations; and in a couple of years, when ISO standards for knowledge management appear, it will likely be about compliance with those standards as well. ‘Already today, if you want to sell a locomotive, one of the criteria is that you engage in knowledge management, and are seen to do so in a very professional way,’ explained Paul.

A second driver is around innovation and process efficiency; people believe there is benefit to doing things better with what you have. And a third driver is the management of risk. And then, in some organisations at any rate, there are concerns about using KM to support governance, strategy and vision.

Paul used a simple ‘three pillars’ diagram to represent the above scheme, but his next diagram, giving some examples of motivators/drivers for KM in the real world, was more complicated and so we reproduce it here as an image, with his permission. He represented five different industrial sectors as examples: nuclear power, the regulatory sector, government, industry and the services sector.

In the nuclear industry, a key driver is planning for the complex process of decommissioning power plants at the end of their lives. Companies anticipate that when that time comes, they will be downsizing, and at the same time losing people with maybe 40 years of experience in nuclear operations and decommissioning.

In the regulatory sector (as Paul and Victoria found through interviews in Canada some years back), a large problem is around succession planning as people at the top retire. This is similar to the driver for Shell’s ‘ROCK’ programme (Retention of Critical Knowledge), prompted by what Shell called ‘The Great Crew Change’.

In government, ‘flexible working’ has been invoked as a mantra. As Paul and Victoria discovered in interviews at the Department of Justice, a possible effect of this is the diffusion of specialist knowledge, as working becomes more generic. But if this can be managed, services can be improved.

Enhancing manufacturing processes is a key driver for industry. Paul described a recent three-year project he ran for Iran’s largest company, which aimed to shorten the time it took from coming up with an idea, to bringing it to market.

In the services sector, including finance and legal work, Paul said that the key to business efficiency is the effective re-use of precedent; it is in this sector that ‘artificial intelligence’ is likely to have the greatest impact.

At this point, Jonathan from Horniman Museum said that he could identify with all those drivers; but in addition, their raison d’être at the Museum is the curation and transfer of knowledge to the general public. Victoria responded that she’d done work about ten years ago for the Museum Documentation Association, funded by the London Development Agency, looking at what museums contribute to the knowledge economy of London. (The MDA shortly afterwards relaunched itself as the Collections Trust.) Two things which she remembers well from that project, which were not represented by Paul’s diagram, were:

As work gets more ‘nomadic’ and fluid, workers in various industries need somewhere they can think of as an intellectual ‘home’; for fashion, it would be the V&A. But when that MDA study was conducted, it seemed that museums were overlooking their rôle in relation to certain professional knowledge networks.

Knowledge Transfer Officers can play a vital rôle as a ‘cog’ or enabling connector, between the more entrepreneurial innovators in the organisation and those whose instincts are more curatorial and conservative; between ‘fast cultures’ and ‘slow cultures’, if you like.

Co-working hubs

Costs as a driver

Paul referred to a 2013 UK government report on Civil Service reform, authored by Andy Lake of Flexibility.co.uk and called The Way We Work: a guide to smart working environments [http://www.flexibility.co.uk/downloads/TW3-Guide-to-SmartWorking-withcasestudies-5mb.pdf]. This pointed out that the costs of providing working environments, both financial and environmental, can be reduced by switching away from dedicated desks and PCs, to co-working hubs.

Paul hasn’t worked in an office for 20 years – his ‘office’ is just wherever he finds himself with his Mac and his ’phone and other devices. Sparknow had an office for about five years, but the team decided it wasn’t necessary as long as people were disciplined in their collaboration practices. An executive recruitment firm in the USA has offered the opinion that perhaps by 2020, 40% of people will be mobile workers (I presume they refer to office jobs only), and that they will be freelancers. The benefits will be lower operating costs and higher productivity.

With Prof Clive Holtham, Paul has been advancing the view that as these developments occur, organisations will have to ensure that working environments – be they physical like co-working hubs, or virtual like arrangements for remote working – will be conducive for effective Knowledge Management. (This is what we were going to be looking at for the rest of the day.)


Victoria Ward noted that when they first started doing Knowledge Audits, people never included looking at their ‘knowledge spaces’. They would look at their networks, their disciplines, but it always surprised the clients when they were asked how the physical workspace functioned. When asked to conduct a Knowledge Audit, they now ask to take a look at such spaces, and ask questions about how they are supported.

Good and bad knowledge spaces

‘Did you know that the average desk is occupied for only 45% of office hours?’ asked Paul. That’s what Will Hutton noted for the Work Foundation in 2002, in a report for the Industrial Society. The foundation claimed that the workplace (the office workspace, that is – not fields and factories, shops and warehouses) was being reinvented as ‘an arena for ideas exchange’ and a drop-in workspace for mobile workers: a place where professional and social interaction can occur. And the foundation noted that workspaces which are badly designed or badly managed can actually damage the physical and mental wellbeing of staff.

The firm of Ove Arup believes that the future of (office) workspace will be a network of locations – many of them on short leases or even pay-as-you-go, shared spaces rather than highly ‘territorial’ ones. Also, they believe there will be a corresponding flexibility in working interactions, operating across both physical and virtual environments.

The Edge. In January 2017, Paul was helping to run some events around the KM Legal conference in Amsterdam. At a Smart Working summit in 2016, Paul had heard of an amazing office building in Amsterdam called ‘The Edge’, so on this trip he made a visit to the place, and was shown around by the architect and the building manager. The building’s developer was OVG Real Estate and the design was by London-based PLP Architecture. The building’s main tenant is the consulting firm, Deloitte. There is a video about the place on YouTube – at https://youtu.be/JSzko-K7dzo – and Paul showed it to us. (There is also a Bloomberg article at https://www.bloomberg.com/features/2015-the-edge-the-worlds-greenest-building/)

The video claims that The Edge is ‘certifiably the greenest building in the world’, with its extensive use of natural light, and harvesting of solar power (the building is a net producer, not consumer, of electricity). Heat pumps circulate water through an insulated aquifer over a hundred metres below, to warm the building in winter and cool it in summer. From the viewpoint of our meeting topic, however, what is significant is how it is structured as a place for a new way of productive working, what the Dutch call het nieuwe werken.

Nobody gets a desk of their own at The Edge; Deloitte’s 2,500 workers there share 1,000 ‘hot desk’ locations, and can also access tiny cubicles or shared meeting facilities, some with massive flat screens which sync with laptops or mobiles. Workspaces are assigned to you according to your schedule for the day, and your ‘home base’ is any locker which you can find empty for the day.

Access to these facilities is driven by a smartphone app used by every worker, and a system which notes everyone’s location and needs and preferences and adjusts the local environment based on your preferences; this is supported by a distributed network of 28,000 sensors.

Paul also commented that people do really want to come to work at The Edge – that’s been a driver of recruitment, there is little absenteeism, and it is somewhere clients want to visit too.  Another thing that users of the building repeatedly praise is the use of natural daylight, which supplies 80% of lighting needs (including through a huge central covered atrium).

Ellipsis Media is a successful content management company, which started above a toy shop in Croydon. They used to have meetings around a particular table in the pub opposite, and as they grew into new premises, they bought that table and installed it as their own little bit of history. Paul mentioned other instances of companies (HSBC, Standard Chartered) using their office space to curate their history – the history of their internal community and its journey.

BMS. Paul also described his engagement with the world’s largest reinsurance broker, BMS, which used the opportunity of their move to One America Square near London Fenchurch Street station. The move brought 13 different federated business units into one shared location. As part of the move, BMS created collaborative physical spaces, including a meetings hub called ‘Connexions’ and an adjacent business lounge with the very best coffee, subsidised snacks and high-speed mobile Internet access. This had a great effect in helping people to break out of the silos of the formerly isolated business units (see Paul’s account of BMS’s journey at http://www.knowledgeetal.com/?p=465).

KHDA. During 2016, on his way back from Iran, Paul went to see friends in Dubai. The Dubai Knowledge and Human Development Authority (KHDA) manages secondary and higher education in Dubai. He showed us pictures of their open-plan workspace – you’ll often see the Chief Executive sitting there. It’s a very informal place – Paul even had a budgerigar fly past his head!

Asian Development Bank. Victoria and Paul worked together (as Sparknow) in Manila, on a project for the Asian Development Bank. ADB’s shared atrium space at the time included a touchscreen with a huge Google Earth display. Victoria added that ADB had long had a traditionally styled library, but had remodelled it, moving the bookshelves to the edge and creating an open central space. The Google map was put there, and used as an ‘attractor’ to cause people to slow down and encounter each other, to cut across the boundaries in the organisation. ADB used the space for a number of knowledge-sharing events, including ‘Inside Thursdays’.

ADB got Paul and Victoria to run a three-day workshop in that space, exploring the use of narrative in the ADB. They were able to construct a temporary collaborative knowledge space, with a long timeline laid out over connected tables, and workstations at which participants could mark out a map of the ADB’s history, and their hopes for its future – and to identify where interviews should be conducted with the oral history practitioners, and what kinds of questions should be asked.

That event was very memorable for its visual components, too. That pop-up knowledge space, and the shared creation of the timeline and other artefacts, created a useful and engaging memory for people when they then looked later at the products of the knowledge work.

ADB published a paper about this in 2010, called ‘Reflections and Beyond’ (184 pages), which can be retrieved as a PDF from http://reflections.adb.org/wp-content/uploads/2014/08/adb-reflections-and-beyond.pdf. There is also a concise Sparknow narrative about the project at http://www.sparknow.net/publications/ADB_Reflections_Beyond_method.pdf.

Exercise set-up: the Knowledge Space Survey

Before we took our refreshment break, Paul gave a little background to a rolling project he has been co-ordinating, called the ‘Collaborative Knowledge Space Survey’. This qualitative enquiry had already gathered contributions, some by email, and some at events such as a Masterclass he ran in March at the International Islamic University of Malaysia in Kuala Lumpur. Now it would be NetIKX participants’ chance to contribute!

Paul’s collaborators in collating and reviewing the results are Prof Clive Holtham and Ningyi Jiang at Cass Business School.


To capture people’s ideas about ‘knowledge spaces’ at work (both physical and virtual ones), the survey has ten set questions, but the answers could be open-ended, in textual and often narrative form. There were certainly no multiple-choice answer mechanisms.

Around the walls of the room in which we were meeting, nine posters had been set out, each one with one of the survey questions (except the first question, ‘Which continent do you work in?’), and the space below left blank in readiness for our contributions. Paul asked us to peruse the questions during the break, and choose which of them we would personally like to work with. The nine remaining questions were:

  • Question 2 — Where do you have your most interesting work conversations and do your best work?
  • Question 3 — Do you think your own workspace encourages collaboration? Tell us about a recent incident where this happened and who was involved.
  • Question 4 — Are there any parts of your building or workspace which you associate with memorable moments of work? Tell us about a time and place when this happened.
  • Question 5 — How does where you work reflect the way you work?
  • Question 6 — Have you ever witnessed a company change its physical workspace radically? What happened?
  • Question 7 — What do you understand by the term ‘digital workspace’?
  • Question 8 — In your experience, can you now replace physical workspace with a digital workspace? If so, how? If not, why not?
  • Question 9 — Does your organisation have a workspace strategy, and if so does it include a digital workspace? Please tell us about it.
  • Question 10 — ‘Any Device, Any Time, Anywhere’ is how one organisation now defines its approach to remote working. Looking forward to 2020, what changes do you foresee in the way you work and the devices you will be using?

Each of us should gravitate towards the question that interested us most, and an ideal group size would be 4–6 people. Grouped around our question of choice, we were asked to consider whether there was a pattern or theme that we might use in a checklist, and what keywords we might use to ‘tag’ the responses we chose.

The exercise process

The way our NetIKX group approached the Collaborative Knowledge Space Survey is not the only way it can be done. For a start, the way we assigned ourselves to particular questions meant that by and large each person contributed to thinking about only one of the nine questions – even though Paul declared the Open Space ‘law of two feet’, and we could have moved from one group to another. But the separate group discussions went well in the 30–40 minutes available.

Nobody was attracted to Question 4, and for obvious reasons Question 1 was off the table. Thus we collected reactions to eight out of the ten survey questions. It is decades since I had anything like a ‘regular job’ and worked in a workplace, so I chose to work in the group clustered around Question 8.

After we had filled our posters, Paul prompted each group in turn to share its thinking with the rest of the room. You can see the posters themselves, which Paul afterwards embedded as images in the slide set accompanying this blog post. I also took my recording gear around the room to capture what people said in more detail.

Q2: Where do you have your most interesting work conversations, and where do you do your best work? — This group discussed the value of having both quiet places and busy places. Melanie described the Hub at DWP, which is a large area with a coffee bar and lots of different tables. On the poster, the group had noted that humour and banter, for example around the kitchen, bring people together and leave you feeling motivated. When you’re on a journey – on a train, or even just walking between places – this has value in freeing up ‘internal conversations’; you often need silence ‘so you can hear yourself think’.

The keywords the group chose were human – flexible – adaptable – informal – fun – balance (between external and internal conversation, and between physical and digital) – mindfulness. Emma added that the most interesting and significant conversations are usually in an informal setting, and are often serendipitous.

Q3: Do you think your own work space encourages collaboration? — This group had started by comparing their own workspace experiences. Lissi referred to her ‘collaboration cocktail’ of spaces, ranging from attending NetIKX meetings to sitting up in bed to do her work. Victoria Ward had a range of spaces and reported positively on ‘Slack’ (slack.com), a cloud-based service which describes itself as ‘real-time messaging, archiving and search for modern teams’ (it’s an acronym for ‘Searchable Log of All Conversation and Knowledge’!)

Graham Robertson works largely alone and his workspace is a room with no windows. ‘Radical uncubicalisation’ was a phrase that came up from two organisations that are trying to draw people out of their cubicles. Edmund Lee (Historic England) said that when people get a taste of this, they love it, but you need other kinds of constraint in place to make things happen.

Collaboration, said someone, involves interaction between human and human, also human and information. Information has its own kind of structure around the workplace; but humans, it must be remembered, have other goals in life, even when they are at work: getting on with people, getting something to eat, whatever.

Q5: How does where you work reflect how you work? — Naomi Lees (DWP) said that the culture that you work in reflects the physical aspects of where you work. For example Ayo Onotola is a librarian who works in a prison. (It is a compulsory requirement for every prison to have a library as part of the process for the reform and rehabilitation of the inmates.) He said that it may surprise people to know that quite a proportion of the prisoners are illiterate; and many don’t have English as their first language; so the prison runs a number of educational programmes for them. The library is key to that.

But – when you work in a prison library, it is a bit like being a prisoner yourself! When there is some trouble in the prison, there may be a general lockdown, then nobody comes to the library all day. Prisoners’ behaviour in the library is quite different from how they behave in their cells; ‘they see the library as a cool place to come and chill out’, Ayo said. And they are keen to collect books to take back to their cells. (On the poster, there was a note that recently the library has been moved to the canteen space, and is now getting more use.)

David Penfold’s example was the university, which has many different possible work environments and people – staff and students both – move between them and find those that are most conducive to what they want to do right now. And people also do much of their work from home.

Q6 — Have you ever witnessed a company change its physical workspace radically? What happened? — ‘Hot desking’ inevitably came up within this group; one person spoke of a transformation to open plan, hot desking and a clear-desk policy, including senior managers. Yes, there was resistance to this at first, but people have come to realise how working together in this way has encouraged the sharing of ideas quite naturally through conversation. It has to be said that the facilities provided were very good. Prior discussion with the users had raised the need for spaces for private conversation, and they had been provided. There are also ‘meeting pods’ set in the middle of the canteen area.

Good space design is crucial, said another person, and consultation with staff in advance is key. When he worked at the Department of Energy and Climate Change, they had discovered that serendipitous meetings often began in lifts and then moved to an adjacent space to continue. In another job, at a research institute, staff had been worried about the place becoming too noisy for concentration; this was met by setting up booths with acoustic shielding, for study or for private conversation.

Canteen spaces are particularly ripe for creative use, and that goes well with a culture that encourages people to take lunch away from their desks. (I remember that when I was doing a series of training workshops at Enterprise Oil, their staff canteen provided lovely food free of charge, which was certainly a motivator in that direction!)

Q7: What do you understand by the term ‘digital workspace’? — This group understood a digital workspace to be something that was open and without boundaries, both boundaries of physical space and time, able to operate 24 hours a day, seven days a week. They noted that this requires sufficiently fast broadband. It should enable you to do what you would do in a ‘normal’ or ‘standard’ workspace, while allowing for collaboration and the sharing of information.

Paul referred to work he had done in Africa, where there is usually very poor access to the Internet. But people adapted to that by communicating via WhatsApp – short, asynchronous conversations that can be picked up again after a communications breakdown.

Q8: In your experience, can you now replace a physical workspace with a digital workspace? If so, how? If not, why not? — This was the group I was in, with Edmund Lee, and the first thing we decided was that the afternoon’s conversation had had an unspoken bias towards office-type work. If you are a plumber, a farmer or construction worker, a social worker or a surgeon, a shop assistant or other front-line customer service worker, what you can achieve in a digital workspace will be strictly limited. So no, you cannot replace physical workspaces with digital ones, except in some narrowly defined fields.

For most of those at today’s event, a so-called digital workspace can substitute for many aspects of the physical workspace, but that is dependent on how good a digital surrogate you can create for the physicality of that with which you work. Edmund works with archaeological excavations, and he noted that before you can consider implementing a digital workspace for such work, you have to find a way to make a digital surrogate of the things you work with. An example would be an expert in Roman pottery who has access to the physical artefacts, but nothing more than a digital representation of the site where they were found.

Another issue is the functionality and ‘affordances’ of the digital tools available. There are bandwidth and infrastructural constraints, and there are human factors. When conversations take place over a digital medium, can they convey body language? Paul agreed that is a huge issue, and he had just been running a workshop with Chris Collison on improving work in virtual teams and communities. [Note that Chris Collison is the speaker at the September NetIKX meeting.]

There are also new skill requirements and support issues. Edmund told of how their IT department had installed large digital whiteboards in the main meeting rooms, but didn’t tell anybody how to use them. So, the technology worked for the IT department but for nobody else!

Q9: Does your organisation have a workspace strategy and if so, does it include digital workspace? — This did not result in a poster, but Malcolm Weston reported on their successes at Aecom, which has now grown (by a process of acquisition and amalgamation) to 187,000 employees across 150 countries. That process required workspaces to be brought together, and collaboration between the business units to be enhanced. And so Aecom did set out a formal workplace strategy, to be implemented in every office worldwide.

The implementation in London involved internal change management consultants and interior design consultants talking to different teams within the organisation, asking what they liked about their current workspace environment, what they didn’t like, and what they wanted changed. The new environment was created in the offices in Aldgate Tower.

Aecom staff now work in ‘neighbourhoods’, with the colleagues in their team, though not always at the same desk. Teams which would naturally want to collaborate on delivering work are situated adjacent to each other. There are internal staircases between four of the floors, with open-plan breakout areas around them. The facilities range from small ‘walled sofas’ suitable for taking part in a conference call, through meeting rooms which can be booked via a phone app, to a small lecture theatre. No-one has a fixed computer; everyone has a mobile phone and a laptop; there is secure WiFi. You can also work from home via the VPN.

One of the drivers was to improve customer satisfaction; another was to avoid costly redesign by getting things right through collaboration, first time around. It also meshed with Aecom’s collaborative selling initiative; clients like to come and have meetings at Aecom’s place.

Q10: Looking forward to 2020, what changes do you foresee in the way you work and the devices you will be using? — This team described a Lloyds Bank demonstration of virtual presence, using a VR headset. They thought one of the challenges of the future might be the emphasis on self-service, and the variety of devices, and perhaps a decentralisation of information storage. The team chose ‘change’ and ‘disruption’ and ‘managing complexity’ as key phrases.

Steve Dale spoke of having recently finished some work for an international organisation spread across thirty countries. Their policy is ‘extreme BYOD’ (bring your own device) – no rules at all about what equipment or software to use. They did an audit, and discovered sixty different systems in use. And there were concomitant problems – a lack of collaboration across teams, and how on earth do you find stuff? They did interviews with stakeholders, and discovered a split between people who like this freedom, and others who flounder in this lack of structure (particularly people newly joining the organisation).

Wrapping up

Paul skipped a number of his slides, which review the survey responses from Lisbon, Kuala Lumpur etc. A couple of slides also pulled out some of the insights which are beginning to emerge in the analysis Paul is doing with Clive Holtham and Ningyi Jiang at Cass Business School.

Paul referred to a meeting he and Victoria had recently had with Neil Usher at BSkyB. Neil has a twelve-point checklist: Daylight – Connectivity – Space – Choice – Control – Comfort – Refresh – Influence – Storage – Colour – Wash – Inclusion. Paul didn’t have time to unpack what these mean; apparently they are explained on Neil’s blog (http://workessence.com/). The two most important aspects, according to Neil, are natural daylight, and giving people choices.

Paul finished the afternoon workshop drawing attention to some closing slides which give contact details for himself and for Victoria.

Paul J Corney – paul.corney[at]knowledgeetal

On Twitter: pauljcorney

On Skype: corneyp

On mobile: +44 (0) 777 608 5857

Victoria Ward — victoria.ward[at]sparknow.net

A personal thought on ‘digital’ vs ‘virtual’

Paul and Victoria contrasted physical spaces where people meet and converse with digital ones. I would prefer a contrast between physical and virtual spaces. My reason is that I wish to give a nod to older traditions of knowledge sharing, which used correspondence and publication – an instance of which is the so-called Invisible College. Collaboration without face-to-face contact did not need an electronic medium to get started; it required shared language, writing, and a means of sending messages.


Gurteen Knowledge Café – 16 March 2017

Conrad Taylor writes:

In 2017, its tenth anniversary year, NetIKX is running a series of meetings with speakers who have spoken to us before. In March we invited David Gurteen to speak around the topic of ‘entrained and entrenched thinking’, and other constraints on knowledge sharing – and, what we can do about it. Specifically we wanted him to run one of his Knowledge Café events for us, in part because that process incorporates features designed to widen the scope of conversation and the consideration of diverse points of view.

As usual, these notes are constructed from my personal perspective.

About entrenched and entrained thinking

‘Entrenched’ thinking is something we pretty much understand. It’s when people refuse to consider the validity of any idea but their own, and it is often encountered in groups that see themselves as actively in opposition to another group. They are ‘dug in’ and refuse to budge. We’ve seen a lot of that in politics in the last year, but it occurs in all sorts of social and business environments too.

The phrase ‘entrained thinking’ is less familiar. It may have been coined by Dave Snowden and Mary Boone in their article in Harvard Business Review in 2007, where they define it as ‘a conditioned response that occurs when people are blinded to new ways of thinking by the perspectives they acquired through past experience, training, and success’. They note that both leaders and experts can fall into entrained thinking habits, which cause them to ignore both insights from alternative perspectives and those offered by people whose opinions they have come to disregard as irrelevant.

Evolutionary biology suggests reasons why falling back on available quick-and-dirty patterns of thinking (heuristics) has survival advantages over thinking everything through carefully from every conceivable angle; as Dave Snowden says, when you come across a lion in savannah country, it’s best not to analyse the situation too thoroughly before legging it up a tree. In his book Administrative Behavior (1947), Herbert Simon referred to such just-good-enough thinking as satisficing, and a study of the nature and role of heuristics in decision making was also central to Amos Tversky’s and Daniel Kahneman’s argument in Judgment under Uncertainty: Heuristics and Biases (1982), which also introduced the concept of cognitive bias – a concept to which David Gurteen made reference.

However, there are times and situations in which it is good to cast a wider net for alternative ideas, which may turn out to be better than the established, so-called ‘tried and tested’ ones. The technique of brainstorming was pioneered in the field of advertising by Alex Osborn in 1939, and Edward de Bono introduced the concept of lateral thinking in 1967, following that with a veritable spate of books on creative thinking techniques.  (Note: The brainstorming process has been brought into question recently; see http://www.newyorker.com/magazine/2012/01/30/groupthink)

In this seminar and Knowledge Café workshop, David Gurteen focused on those blockages to ideas production and sharing that can occur in meetings and group conversations, and the actual practice of his Café technique shows some ways this can be done. So let’s get to understand the Café process, then move on to how David introduced our session, and close with a brief report of what came up in the in-the-round plenary session.

Introducing the Café process

David’s Knowledge Café process is a version of the World Café technique first devised in the mid 1990s by Dr Juanita Brown and David M Isaacs, an American couple who work with organisations to unleash ‘collective intelligence’ through ‘conversations that matter’. (See http://www.theworldcafe.com/) These techniques have been used by industrial companies, such as Hewlett-Packard and Aramco, and by local governments and non-profits in the context of social and economic development and community relations.

David says that he adopted the format as an antidote to ‘Death by PowerPoint’. He started running his Knowledge Café series in September 2002, in the Strand Palace Hotel. A number of NetIKX members have taken part in free London Knowledge Café events, which David facilitates about six times a year. More information can be found on his knowledge café website http://knowledge.cafe/

David has also run such sessions professionally for organisations across Europe, Asia and the Americas. They seem to work well in a wide range of cultural settings – even, he said, in those Asian cultures in which people often defer to authority. In a small group, it is easier to speak up about what you think, though as an organiser of such an event you may need to ensure that the groups are made up of equals.

The essence of the Café technique is to enable discussion around an open-ended question. Participants are divided into groups of three, four or at most five people, sat around tables (note: this is smaller than the typical size of a table group at a NetIKX workshop). In David’s events, the starting question is framed by having a speaker make a very short initial presentation of the topic – typically ending with the posing of such a question.

After the discussion around tables has gone on for some time, generally 15 minutes, the facilitator asks the table groupings to break up and re-form – for example, two people might stay on the same table while two move off to join other tables. After another 15 minutes’ conversation, the table groups are once again re-organised for a third round. David never uses more than three rounds of conversation in his own practice. The general aim of such Café techniques is to help people to accumulate a cascade of perspectives, and to widen their thinking.

There are variations on this theme. One World Café practice is to put a large sheet of paper on each table and encourage people to jot down ideas or doodle pictures during their conversations, so that the next group gets a sense of what has gone before. Another version appoints a ‘table host’ who stays with the table, relays ideas from the previous round, and encourages the new group to add ideas and perspectives to what has gone before. Such a person might also act as a rapporteur in a closing plenary session.

David’s practice dispenses with table-level facilitators (and doodle pads and rapporteurs), which makes a Gurteen Café easier to organise. The accumulation of perspectives tends to happen anyway, as people tend to share, with their new group, the ideas that came up in the previous one.

In David’s version of the Café, he said, there is no reporting back. The café’s outcomes are about what each individual takes away in his or her head – and that will be different for each person. As Theodore Zeldin says, the best conversations are those from which you emerge as a slightly different person.

However, David later qualified that by mentioning circumstances in which gathering and reporting the ideas that surface can be very valuable as a contribution to a problem-solving process – for a company or project, for example. His own general events tend to end these days with a closing session bringing everyone together in a circle for freeform sharing of ideas ‘in the round’ – space permitting. We did this at the NetIKX Café.

David explained a few guiding principles. The Café is about dialogue, not debate – it’s not about winning arguments, but nor is it about seeking consensus. The idea is simply to bring ideas and issues to the surface. And it is OK to drift off-topic.

Asked whether it is different to run a Café session inside a particular organisation, David responded that he’s found that the format can be used for brainstorming or as an adjunct to problem-solving; in that case, one should start in advance by defining the objective, and design the process accordingly. For such gatherings, you probably do want to include a method for capturing the ideas that arise. But any such capture mechanism must not get in the way of the conversation – giving one person a flipchart and a pen will put them in charge and distort the free exchange of ideas.

Our meeting topic

David explained that in his short pre-Café presentation he would touch on some challenges that we need to overcome in order to make better decisions, and to be more creative in coming up with ideas. In the Café process we would then discuss how we might mitigate these challenges.

Cognitive bias. David recommended that we take a look at the Wikipedia article about ‘Cognitive bias’. That in turn links to a page, ‘List of cognitive biases’ – something like 200 of them, although it has been argued that they can be grouped into four broader categories, arising from too much information, not enough meaning, the need to act or decide quickly, or the limits of memory. One of the ideas that has made it into common parlance recently is ‘confirmation bias’ – we tend to pay heed to ideas that reinforce our existing views.

Entrained thinking. This seems to be a relatively new idea, put forward by Dave Snowden and Mary Boone as described above. The idea is that we are conditioned towards certain ways of thinking by our education and training, and also influenced by our culture, the environment in which we have grown up, and our experiences. These influences are so subtle and ingrained that we are probably not aware of them.

David asked me (Conrad) if I see things the same way. I replied that I do – but that although ‘entrained thinking’ appears to be a new term, it isn’t really a new idea. When I was studying the History of Science at Glasgow University, an influential book was Thomas Kuhn’s The Structure of Scientific Revolutions (1962) – the book that introduced the phrase ‘paradigm shift’ to the English language. Kuhn argued that scientific progress was not, as generally assumed, a matter of development by accumulation of facts and theories, but more episodic, involving the overthrow of previously prevailing ways of seeing the world. And until the new paradigm prevails, the old one will have its deeply entrained defenders.

One example that Kuhn analysed at length was ‘the Copernican revolution’ – the shift to the view that the earth orbits the sun, rather than the other way around. Darwin’s theory of evolution also met with strong opposition from people invested in a creationist narrative and a Biblical timescale for the earth’s existence; more recently, the theory of plate tectonics and continental drift was resisted and mocked until the 1960s – yet it is now one of the ground truths of geological science. So Kuhn’s idea of a ‘paradigm’ – a way of thinking that one considers normal and natural (but that may later be replaced by a better one) – does carry in it a notion similar to ‘entrained thinking’.

Entrenched opinions. People may be resistant to taking new ideas on board – they take an entrenched position. Such people are not prepared to listen; they ‘know they are right’ and refuse to consider an alternative interpretation. In this case people may be very conscious of their views, which are closely bound up with their sense of themselves.

‘Speaking truth to power’ is a phrase that we hear a lot – it could mean not being afraid to say something to your boss, even though the consequences for you could be dire. The phrase recognises that power relations influence whether we choose to express our thoughts and views openly.

Loss of face. If you’ve always got to look good, it’s very difficult to speak up.

The Spiral of Silence – also called ‘social silencing’ – is an idea David encountered only recently. It’s a concept in political science and mass communication theory put forward by Elisabeth Noelle-Neumann, who argues that fear of isolation, of being excluded or rejected by society because you hold a minority opinion, may stop you from speaking out. She also argues that the media not only shape what ideas are the dominant ones, but also what people perceive to be the dominant ideas, even though that perception may not accord with reality. (Much of the mainstream media is telling us that the British public are united in a determination to leave the EU, for example.)

A related critique of social media – Facebook, for example – is that it encourages people to live in bubbles of confirmation bias, connecting us to people who share the same ideas as ourselves.

Groupthink is a well-known term. Perhaps people in a meeting really do all think the same way – or is there a sizeable group who think differently and just don’t want to rock the boat?

Last on David’s list was facilitator bias – was he, for example, in a position to bias our thinking?

The questions for the Café

So here were a few barriers that can get in the way of a good conversation, and thus impoverish group creativity and problem solving. David invited us to go into Café mode and talk about how to overcome these problems.

In the promotional text for this meeting we had asked three questions, and David suggested that each ‘round’ of the Café might look at one of them in turn.

  • The first question is: what factors in people’s backgrounds, professional education and culture lead them to have a ‘blinkered’ view of the range of available opinions and policy decisions, especially at work? How might this be mitigated?
  • Second, when we meet in groups to decide something in common – to come to a practical decision – what meeting dynamics get in the way of our accessing the broadest possible range of opinions and inputs? Could we run those meetings differently and get better results?
  • Finally, what are those two questions forgetting to consider?

Big Circle discussion notes

After three rounds of ‘Café table talk’, we rolled the tables out to the edges of the room and created a circle of chairs (there were about forty of us), and continued the conversation in that mode for about 25 minutes. I’m not going to report this blow by blow, but highlight some ideas put forward, as well as comment on the process.

It’s worth pointing out that the dynamics of discussion in the larger group were (as one might expect) very different from those in the small groups. Some people said a lot, while about half said nothing at all. For the first nine minutes, about ten people spoke, and all were men. There was a tendency for one person to make a point that was reacted to by another person and then another, in a ‘chain reaction’, even if that meant we drifted off topic. For about five minutes, the tone across the room got quite adversarial. So while the technique of forming a big circle does help to gather in what had been thought across the table groups in a Knowledge Café, it has its perils.

Meeting management methods. Steve Dale mentioned that at the LGA’s Improvement and Development Agency, there was a manager who used to take his team out on a walk – about ten people – and they talked as they walked. People wondered how practical that was! David Penfold suggested that if they walked and talked in groups of three, then they could stop at the pub and have the whole-group session – a Knowledge Café on legs!

Steve also pointed out that in some meetings – with fixed time and a business agenda – a free-flowing conversation would waste time and get in the way. Various people noted that one could loosen up thinking with a Café-style session or brainstorming, and follow that with a decision-making meeting – preferably after a fallow period for reflection.

Someone outlined a method she finds useful for eliciting a wide range of contributions. Pose an issue and get people to reflect on it individually for a while, in silence; then ‘pair and share’ with one other person, to practise articulating your own ideas and listening to others. Then you can progress to groups of four, and finally feed back from the groups to the whole assembly. When people are divided into small groups, we noted, the dominant types can only dominate a small group!

Dominance in group situations. Gender dominance or imbalance can affect the dynamic in discussions, as can dominance by age or ethnicity. Clare Parry spoke of occasions when someone from a minority makes a point and it is ignored; then someone from the majority says the same thing, and suddenly it is a fantastic idea. These biases might be subconscious; but a younger person thought that discounting the opinions of younger people could be a quite conscious bias, based on the opinion that older people are more likely to know what they are talking about.

Bad paradigm shifts and entrainment. I (Conrad) thought it would be a mistake to think that paradigms always shift in the right direction! An example might be an assumption that information management is something that computer people do… We debated for a while whether this assumption was as widespread as 20 years ago: opinion differed.

Dion Lindsay, in his work around both information and knowledge management, finds that information professionals make a huge assumption that they are the best people to lead an organisation’s efforts in knowledge management. They see a continuum between librarianship, information management and knowledge management – which is not how the rest of the organisation sees things. And that, he said, is an example of entrained thinking (on both sides, perhaps).

Unfortunately, but predictably for this NetIKX crowd, this issue of IM and KM and IT set off a long series of exchanges about the rights and wrongs of managing information in a technology environment, which strayed right off the point – and got quite heated!

Influencing culture from the top down. One table conversation speculated that if a group of people at board level have got stuck in a rut with a particular way of doing things, this could be mitigated by bringing in someone with different thinking – like a football team appointing a maverick manager to shake things up. On the other hand, this could backfire if ‘the old guard’ react by resisting and subverting the ‘outsider’.

An open, learning culture. Stuart Ward argued that organisational culture can be a positive influence on how decisions are made – if the people at the top visibly promote diverse thinking by asking people for inputs and opinions. Nor should people be penalised for making mistakes, if the result is learning that can be shared to improve future results.

We came to no shared and agreed conclusions – but that’s not what a Knowledge Café does. Everyone took something different away in their heads.


Survey Results

Naomi Lees, NetIKX Membership Secretary writes:

Thank you to everyone who responded to our NetIKX survey earlier this year. We had some very interesting and useful responses.

Here is a brief overview of the points raised, and what NetIKX plans to do over the next 12 months:

Programme Planning

We had some very useful feedback on the seminar topics you would like to see, especially around the future and value of KIM; as well as practical KIM tools and techniques. You will be pleased to know that we will be covering all these aspects and more in our programme in 2017 and early 2018, so check www.netikx.org/events or https://netikx.wordpress.com/events/ for details of future events.

We also had some other useful suggestions for future seminar topics, which our programme planner has taken away for further cogitation! Watch this space for further updates.

Events outside London

You said that you would like to see more events outside London – we are currently looking at ways we can make this happen. If you are keen to host an event outside London, please get in touch.

Partnering with other KIM Groups

We had some very encouraging feedback on developing partnerships with other KIM groups, and we have a KIM Communities event coming up soon. We are always interested in building connections with other KIM groups, so please get in touch if you have any ideas for joint working.

NetIKX Website

We received several comments on the website, and we are really grateful for this feedback. We are currently working on a new website with many of the features you have asked for, such as more KIM resources and the ability to make electronic payments.

The survey results can be viewed here: https://www.surveymonkey.com/results/SM-VZXFF523/
