What’s the point?

Image: Center for Information and Communication Techn... (by whiteafrican, via Flickr)

There is no question that there is a surging urge to digitise. But what inspires this? What is the point of all this activity? There are a number of strands to this conversation, and I look at some of them here; there are doubtless others. What does emerge is that there are professional, philosophical, economic and possibly even cultural differences in approach to digitisation, and these are by no means consistent or consensual. In fact, some of the drivers for digitisation seem to be using the same means to achieve quite different ends.

1. The first and perhaps most obvious inspiration for the digitisation of the world’s documents and cultural artefacts finds its origin in the zeitgeist of the so-called information society: a zeitgeist, may it be said, which by now is surely rather old and tawdry, and exposed for the misconceived delusion that it is. We now know that all societies have always been ‘information societies’, and by and large we agree with Daniel Bell and Manuel Castells that the concept of the ‘information society’ is in fact but another stage of the capitalist industrial society, one which encourages consumerism. We are aware that the notion of ‘globalisation’, in the way it is enacted by multinationals to exploit the poor and disadvantaged in favour of the rich, has some serious ethical questions to answer. We can also, quite quickly, dismiss the idea that technologies, in and of themselves, can create change or increase social development: it is the USE of them, and the PURPOSES for which they are used, that will make the desired differences in the lives of individuals, communities and societies. This purpose, from the point of view of information professionals, is to assist in the communication of information (or ideas) between people. Alas, at the same time there seems to be a parallel desire, at least in certain regimes, to keep populations ignorant or misinformed: information flows are suppressed.

2. A second driver for digitisation is certainly economic. This has two aspects: firstly, digitisation and the increasing use of information and communication technologies (ICTs) seem to be understood as the way to create new jobs, new possibilities to make money and perhaps even fortunes. This aspiration was dashed at least once, with the dot-com bust at the turn of the millennium: the only people who seem to be making money now are those who are selling the equipment – which needs to be constantly updated and replaced – and the software – although possibilities here seem limited by the increased availability of free software and, more importantly, Open Source coding systems. Some online endeavours are financially valued in strange ways, too, which are perhaps difficult to understand. The billions of dollars that Facebook is allegedly worth are, to my mind, a strange phenomenon. But there are still seemingly unlimited opportunities for online merchandising, marketing and retailing, and consultants in social networking marketing seem to be thriving.

The other side of the economic or financial aspect is the possibility of saving money and cutting costs. This applies not only to the vending of virtual objects such as ebooks and online services (website hosting, for example), which cost little to store and maintain. The replacement of libraries by the internet also seems to be a very real possibility for many governments dealing with the fallout from the Global Financial Crisis (GFC – which always, for some reason, reminds me of Dahl’s Big Friendly Giant, the BFG). David Cameron’s present regime in the UK is a good example of this: it extends to replacing large numbers of public servants whose work can, apparently, also be done by citizens using the internet. ICTs continue to be deified as saviours of the world, if one is to believe the rhetoric expressed in many government documents, particularly perhaps some of those emanating from the European Union’s iEurope digital economy initiatives.

3. Digitisation of documents does, however, open doors that were previously firmly shut. The Open Educational Resources University (http://wikieducator.org/OER_university#Core_initiatives_of_the_logic_model) campaign, being led, to all intents and purposes, by Wayne Mackintosh, is a prime example of this. It uses the best characteristics of the ‘information society’, such as globalisation, to reach scholars and teachers all over the world, in order to create and distribute university learning materials to people everywhere – not just in the rich parts of the world – so that they have access to tertiary education. Surely this is the only way forward, in this dimension? I have mentioned Open Source software; there is also an increased movement towards access to ideas that is possible in a digitised, virtual, networked information environment: Open Access. This is particularly useful for the dissemination of scholarly information, as well as those documents that are required to support other roles in society, not forgetting entertainment. All of these possibilities, combined with the increasing mobility of ICT devices (smaller and cheaper) and wireless access, may perhaps lead to significant improvements in people’s lives. Some even say that ICTs facilitated the recent political changes we have seen in North Africa.

4. We cannot rule out the possibility that digitisation is also being stimulated by technological determinism. “Oooh! I want to build a twaddler! It’s new! It’s big! It’s shiny!” But what can it be used for? Does it help me? Will it last for ever? Do we need one?  Rather cynically, there does appear to be some of this in a few digitisation initiatives, which have lasted for only as long as the funding has been around – and there doesn’t appear to have been enough reason or purpose to continue the funding. While, for many reasons, I endorse and support – and am enthusiastic about – the purposes to which the digitisation of cultural resources and documents can be put, I am still more than a bit concerned about the long-term prognosis. ‘Digital preservation’ still appears, to me, to be an oxymoron. As well as this, as I have been saying for about two decades, the technology is still very primitive: I don’t think that our clever colleagues in computer science and technologies have come anywhere near to where their work might still take them. Regarding existing technologies as the ‘last word’, or even suggesting that things may stay more or less the same (simply because our imaginations fail us), could mean making a very big mistake indeed.

5. The last aspect of the enthusiasm for digitisation may be a desire for control (above and beyond any economic or financial considerations). Access to information (or ideas, which I find to be the most useful synonym) has always been, and will always be, regarded politically, as ideas may be – and indeed often are – dangerous: at least to the status quo, and especially to those who would be upset or lose out if the status quo were to be disturbed. Paradoxically, digitisation simultaneously provides the possibility for loss of centralised control: the use of Twitter and Facebook in Egypt, for example, or, perhaps as a slightly more exaggerated example, WikiLeaks and now UniLeaks (http://www.unileaks.org), which could be seen as serving as the conscience of contemporary society. Citizen journalism – and indeed all social media – is another expression of this facility. Information, or ideas, no longer have to be sanctioned by those in power or positions of authority: anybody (even me) can say what they like and have the possibility of being heard all over the world. UKUncut (http://ukuncut.org.uk/blog/26th-march—invite-your-friends) provides but one example of this. This may possibly be an unexpected outcome of (4) above: “We invented the twaddler but we didn’t realise it could be used like THIS!”.

Looking forward to hearing from you – and please post comments here and on the Wallwisher!

All the best as ever, wherever you are

S

We just aren’t sexy enough. Digital World, Part 2.

Image: World Summit on the Information Society, Tunis... (via Wikipedia)

Information professionals are not taken seriously.  In fact, it is doubtful if we are ‘taken’ at all: who ever thinks of us?  Were we a news-breaking element in the World Summit on the Information Society?  Are librarians, archivists, records managers and other information professionals regularly consulted when so-called ‘information policies’ are being created? (I put ‘information policies’ in inverted commas as they usually have to do with either technology or economics, rather than information itself.)  Why is all the theory that already exists in the field – in information retrieval, user behaviour, learning, categorisation and so on – steadfastly ignored, only to be laboriously reinvented when required?

We are concerned with ‘upliftment’ and ‘preservation of cultural heritage’ and ‘corporate memory’.  We are anxious that people don’t ‘know’ enough, and give them more than they need to know (something I myself have been rather guilty of in this blog, flooding you with my opinion).  We understand the consequences of losing or destroying documents, and so are preoccupied with preservation and conservation. Anybody concerned with technology is a ‘geek’: a male with gross personal habits and no social habits; usually with pimples and a paunch from pizzas, he performs magic at his keyboard which is totally incomprehensible to a non-geek.

We are doing things the wrong way.  We are not taken seriously because we take ourselves too seriously. We are out of line with the current cultural environment, in which everything is easy, quick, attractive. With all our talk of ‘the user’, we are making the classic mistakes that Mrs Thompson made with me in Grade 8 Mathematics: (a) she assumed that I understood what quadratic equations were; (b) she thought I cared; and (c) she assumed I loved mathematics and would exert myself to overcome the obstacles that stood in my way.  She was wrong on all counts.

We must start with where the user is.  The user spends his/her day in a world of unemployment, recession and mobile technologies.  S/he is stifled in an oppressive regime, in a world of immeasurable opportunity; has access to anything or everything or nothing; suffers inequities of gender, sexuality, religion, race, class or political persuasion.  Even the ‘good’ countries are flawed: look at income distribution in the US (http://motherjones.com/politics/2011/02/income-inequality-in-america-chart-graph).  His/her favourite activities are playing computer games, abusing alcohol and drugs, living an alternative lifestyle for the planet, or struggling to survive.  S/he can’t believe what s/he reads (http://www.guardian.co.uk/media/2011/feb/23/churnalism-pr-media-trust; http://blogs.reuters.com/gbu/) or hears (Gulf oil is not a fossil fuel: http://www.youtube.com/watch?v=ck01KhuQYmE; New World Order and the US Federal Emergency Management Agency: http://www.youtube.com/watch?v=Xd9NX8dPE1I; US military is spraying chemicals into the air: http://www.youtube.com/watch?v=V_zaCpVj_jc – pick your own conspiracy theory), but often doesn’t know the difference, or doesn’t even care.  They are downloading all the movies and TV shows they want to see using uTorrent or BitTorrent.  Do they care about breaking copyright laws?  Doesn’t look like it.

Can we compete when most of our research looks at relatively sophisticated, educated people: students, scholars and academics?  How much information is communicated on any given day in any given organisation that changes the way the organisation works?  How many students request advice from a librarian for an essay?  How many governments consider their political legacies in terms of the documents they leave behind?

How can we get a new healthy outlook and a sexy new approach?  We need to reinvent and rejuvenate.  We will also need to increase our numbers, and our specialisations, enormously.  The digital information environment is throwing up challenges that we haven’t even started to consider, as we plan and strategise for a digital future. Do you agree?

How to cope in the digital world. Part 1.

Is this in the future for information professionals?

Now that we are well on our way into 2011, I’ll dare to suggest some of the things that I think we will have to think about in the near future.  I did once acquire a crystal ball, but it didn’t work: I therefore offer no predictions, but rather some thoughts on what seems to be going on at the moment, focussing on the possible effects on the information management professions.  I will mention some of these each day for the next couple of days.  Please do not hesitate to comment, as well as to add issues and phenomena that are important in your field of endeavour.

Multifunctionality and convergence

We have seen, for more than a decade, increased multifunctionality of information and communication technologies (ICTs).  The phone is now a camera, voice recorder, workout monitor, letter writer, internet browser, aide-mémoire and map finder, as well as other things.  This is a continuation of the development of computers, which were soon used for a lot more than just arithmetic and calculation.  The social media and search engines are moving in the same way: Facebook does email, Bing integrates Facebook data, and Facebook can also be used to become a member of various blogs and websites of interest.  Google appears to be positioning itself to run the Googleverse, as it develops its own versions of popular software – such as email and word processing – as well as its own equivalents of Skype, blogging and Flickr and, of course, the library: Google Books.  And then there’s Google Scribe, which anticipates what you are going to write; Google Body, which allows you to peel back, layer by layer, the human body; and Google Goggles, which enables you to search Google using pictures from your smartphone.

I posited previously (2002) that converging technologies have led to increasing convergence between the information professions: will this continue?  I believe that this would be desirable, but whether it is practicable and attainable is, of course, a different matter.  The arguments for increased convergence – or at least collaboration and multidisciplinary interaction – include a stronger public presence and perhaps more political clout (within organisations and communities); sharing of solutions to problems which have perhaps been located within particular disciplines/professions, but which are experienced by all; and recognition of the similarities, rather than the differences, of the challenges that face information professionals.  Some of the more complex issues that must be dealt with include the retention of professional profiles, as each discipline/profession has unique characteristics and different contributions to make; the plethora of professional associations, all of which require membership fees and produce newsletters and journals that must be read; and lastly, the overwhelming number of subdivisions that can be identified in this enormous field.  Too much ‘multifunctionality’ can be diffuse – Jack of all trades, master of none.  But such demands are presently made on us: just consider the number of different tasks that must be executed in the role you currently occupy.

Social networking and user-generated content

The appearance and ongoing development of Web 2.0 appears to have no end.  In the analogue world, because of the relatively tedious ways in which documents were created and distributed, more control was possible, perhaps out of necessity.  Documents were not created or published unless it was necessary for whatever reason.  Publishing procedures were closely linked to bibliographic control systems: ISSNs, ISBNs, in-book cataloguing information, edition statements and so forth formed part of a vast mechanism.  But even in the 1980s and before, people complained about information overload.  Then the internet appeared, and information professionals groaned: how on earth were we going to manage this flood of documents?  It appeared that every Tina, Dorothy and Helen could publish whatever they liked.  We didn’t even know what was out there, never mind trying to keep up with classification and cataloguing.  And then Web 2.0 happened, with amazing social possibilities.  The hallmark of this version of the internet is user creation and interaction.  Barthes mentioned the ‘death of the author’, in the sense that each reader will recreate an author’s text, an idea explored also, in some detail, by Umberto Eco in his ‘The open work’.  The death of the Author, with a capital A, has another interpretation now: the Author does not have to be condoned, approved, validated, lionised or even recognisable to be able to publish as much as s/he wants to.

Part of the problem for the reader is being able to contextualise the author, in order to draw meaning and fully understand the ideas that are being conveyed.  The Author is no longer automatically an ‘authority’ (“I read it in a book so it must be true”): far more sophisticated skills are required in order to select, understand, analyse and critique the information with which we are now overwhelmed.  This is sometimes called ‘critical information literacy’, which is quite different from the ‘information literacy’ that librarians used to know and love.  In fact, it might almost be called ‘critical media literacy’ or, the term I currently prefer, ‘Critical Digital Literacies’.  All the technology in China – and the rest of the world – will not help us one jot if the general population does not develop these skills.  I believe that we, as guardians of memory and cultural heritage, are the very people to undertake this.

Increasing epublishing and ereading means, at the very least, that we must familiarise ourselves with the tools that are required.  Does this mean the end of publishers?  How does it change the publishing cycle?  There have already been huge shifts in educational resources and scholarly communication patterns (more on this at another time); Open Access and Open Source are widely used and increasingly popular.  This will have, perhaps, the greatest impact on poor countries – but what will the nature and consequences of this be?

Consider the rise of civilian journalism.  I grew up in an environment in which it was natural to doubt every word on the radio or in the newspapers on current events; we needed to understand that we were being fed half news or even no news at all.  Sadly, in environments where ‘free speech’ is protected by law, too many accept that the news being reported, and the comments made on it, are both important and authentic.  The ways in which journalism (‘churnalism’ is a new aspect of this – see www.churnalism.com) and the media operate are accepted as part of the transparent background.  Civilian journalism empowers ordinary people to report directly on what is happening: this, enhanced by Twitter and Facebook, provides different interpretations and views.  It can be said, therefore, that in this regard, the internet is like Foucault’s Bibliothèque fantastique: a place where we go to discover ideas and to have them challenged.  The new heroes are, if you like, at the bottom of the pyramid – in terms of sheer number, at least.

The other aspect of this is that printed newspapers are likely to shift to online only.  An advantage of this for individuals is that they can use push technologies – news aggregators such as RSS feeds – to deliver only the bits they want to know about.  And then there was Twitter – and now, for those with iPad tablets, Flipboard, which allows you, effectively, to create your own magazine.
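As a very rough illustration of how this kind of ‘push’ filtering works – a minimal sketch only, assuming the third-party Python feedparser library, and using a hypothetical feed URL and keyword list rather than any real service – an aggregator essentially polls a feed and keeps only the entries its reader has asked for:

```python
# A minimal sketch of RSS-style "push" filtering, assuming the third-party
# feedparser library (pip install feedparser). The feed URL and keywords
# below are hypothetical placeholders, not real services or recommendations.
import feedparser

FEED_URL = "https://example.com/news/rss.xml"   # placeholder feed address
KEYWORDS = {"libraries", "digitisation", "open access"}

def interesting_entries(url, keywords):
    """Yield (title, link) pairs for feed entries whose titles mention a chosen keyword."""
    feed = feedparser.parse(url)          # fetch and parse the RSS/Atom feed
    for entry in feed.entries:
        title = entry.get("title", "")
        if any(word.lower() in title.lower() for word in keywords):
            yield title, entry.get("link", "")

if __name__ == "__main__":
    for title, link in interesting_entries(FEED_URL, KEYWORDS):
        print(title)
        print("  " + link)
```

Real aggregators and magazine-style apps layer scheduling, de-duplication and presentation on top of a basic filter like this, but the principle – the reader, not the publisher, decides what gets delivered – is the same.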

As information professionals, what are we going to do about this?  How will we manage and encourage access to all these ideas?   A Sisyphean task, seemingly.  How can our knowledge and skills be used?  How can we access and use user commentaries and annotations?  At the same time, we must ask, “Who is NOT using the internet?  Who is NOT publishing their ideas?”  This group may include anyone from serious scholars to the illiterate and disadvantaged: whose voices need to be heard?  Should we have any involvement with this – knowledge creation and distribution?

The rise of secret gardens, or, the Splinterweb

Social networking is all well and good, but perhaps the hysteria is now over: do we all want everybody to know our every move, our every mundane and trivial thought?  And let’s not mention the time it takes to pursue this triviality.  It seems that people are becoming more selective, perhaps more discreet, and attempting to use their internet space and time more meaningfully.  This would suggest not only targeted audiences, but a judicious and discriminating approach to who can see what.  There is little doubt that, with the emphasis on intellectual property (note, for example, the astronomical number of patents that are being applied for and approved), most knowledge creators/publishers wish to protect and preserve theirs.  So, while a considerable portion of the internet will remain public and open, increasingly we are likely to see inaccessible areas.  Costs may be involved, too.

I would be very interested to hear what issues you believe confront us at this juncture.

All the best

Sue

Archie Dick on the political role of libraries

I understand, like Foucault, that power/knowledge are inextricably joined – not so much that ‘knowledge is power’ (otherwise we would be the most powerful people on the planet), but rather that the ways in which knowledge is created, and the ways in which it flows through organisations and society, demarcate routes of power.  And those who control or influence knowledge creation and information flows have a decidedly political power – which is why libraries are the first to go as authoritarian and dictatorial regimes assume control.  Ideas are dangerous: if nobody’s said that already, I’ll say it here.

My good friend, and a man whose work I greatly admire, is fellow South African Archie Dick.  In this short clip (four minutes), he talks a bit about the role of libraries during the Apartheid years in that country.  Food for thought.

http://www.youtube.com/watch?v=TNY_B4raML4

Enjoy your day, wherever you are.

All the best

Sue