2008/12/06

Microsoft Troubles - IV

Previously I've posted on my conjecture that Microsoft will hit turbulent financial times in 2010: Microsoft Troubles III, Microsoft Troubles II and Microsoft Financial woes in 2010

This article in CNN Money/Dow Jones Newswire cites data showing some of the early effects becoming apparent.
Sales of Windows grew just 2% in the first quarter of fiscal 2009, which ended Sept. 30, 2008. In most years, Windows posts double-digit revenue growth, according to company data.

2008/11/30

Finance, FMAA & ANAO and Good Management: Never any excuse for repeating known errors

In light of Sir Peter Gershon's Review of the Australian Government's use of Information and Communication Technology, here's an email I sent to Lindsay Tanner (now the Finance Minister) prior to the 24-Nov-07 election of the Rudd ALP government. Edited lightly, formatting only.

Date: Sun, 11 Nov 2007 15:02:40 +1100
From: steve jenkin 
To:  lindsay.tanner.mp@aph.gov.au
Subject: Finance, FMAA & ANAO - Good Management: Never any excuse for repeating known errors

Here is something very powerful but simple to implement & run. It would amplify your proposed review of government operations and could be used to gain a real advantage over the conservative parties. On 8-Nov I wrote a version via the ALP website.


Headline:
The Libs talk about being Good Managers, but they have been asleep at the wheel for the last 10+ years.

It's not "efficient, effective or ethical" to allow public money to be wasted by repeating known mistakes.

Nothing new needs to be enacted - only the political will to demand Good Governance from bureaucrats and the 'ticker' to follow through.


2008/11/29

Gershon Report - Review of Australian FedGovt ICT

The Gershon Review is good solid stuff that doesn't rock the boat, doesn't challenge current methods & thinking, and doesn't show deep understanding of the field.

It has a major omission - it addresses ICT inputs only.
ICT is useful only in what it enables others to do or improve - measuring & improving ICT outputs is completely missing from 'Gershon'.

It doesn't examine the fundamentals of ICT work:
  • What is it that we do?
    How is Computing/IT special or different to anything else?

  • Why do we do it?
    Who benefits from our outputs and How?
Here are my partial answers to these questions:
  1. Computing is a "Cognitive Amplifier" allowing tasks to be done {Cheaper, Better, Quicker, More/Bigger}.

  2. IT is done for a Business Benefit.
    As in Marketing, defining how outputs & outcomes are measured and assessed - at both the macro and micro level - is one of the most important initial tasks.

Gershon doesn't address outstanding issues of the IT Profession:
  • improving individual, organisational and general professional competence and performance.
  • Reducing preventable failures, incompetence/ignorance and under-performance.
  • Deliberate, directed & focussed effort is required to institute and maintain real Improvement of the Profession. (vs 'profession-al improvement' of practitioners)
After ~60 years of Commercial Computing:
  • Are there any new ways to stuff things up?
  • Is it "efficient, effective, ethical" to allow known Errors, Mistakes, Failures to recur without consequences? [see FMAA s44]
It isn't as if the Government is unaware of the processes and instruments needed to avoid repeating Known Errors, nor of the benefits of doing so.

Aviation is controlled by the ATSB (Australian Transport Safety Bureau, previously the Bureau of Air Safety Investigation [BASI]) and CASA (Civil Aviation Safety Authority). The USA's FAA publishes hard data on all aspects of Aviation - and mostly the numbers improve on every measure every year. This isn't just due to the march of technology - the figures for 'General Aviation' (as opposed to Regular Passenger Transport) plateaued decades ago... This is solid evidence that Aviation as a Profession takes itself seriously - and that commercial operators in one of the most competitive and cut-throat industries understand the commercial imperative of reducing Known Errors.

Aviation shows that profession-wide attention to Learning and Improvement isn't just about Soft benefits, but translates into solid business fundamentals. You make more money if you don't repeat Known Errors/Mistakes.

ATSB investigates incidents and looks for Root Causes.
CASA takes these reports and turns them into enforceable guidelines - with direct penalties for individuals, groups and organisations. CASA is also responsible for the continual testing and certification of all licensed persons - pilots, Aircraft Engineers, ...

There are 5 specific areas Gershon could've included to cause real change in the IT Profession - to start the enculturation of Learning & Improvement and the flow-on business gains.
Federal Government accounts for 20% of total Australian IT expenditure. It is the single largest user and purchaser of IT - and uniquely positioned to redefine and change the entire IT profession in Australia.
  • Lessons Learned - Root Cause Analysis of Failures/Problems
    Dept. Finance 'Gateway Review Process' on Projects.
    Needs equivalent of CASA - inspection and enforcement of standards plus penalties/sanctions - Not just reviews and suggested guidelines.
    Not just ICT staff, not just FedGovt but their suppliers/vendors/contractors as well.
    Without real & timely (personal and organisational) consequences, nothing changes.

  • Standish 'Chaos Report' equivalent - real stats on IT Projects.
    Without solid numbers, nothing can change.

  • Operational Reviews.
    How well does an IT organisation do its work?
    Critical Self-assessment isn't possible - exactly the reason work needs to be cross-checked for errors/mistakes/omissions/defects.
    C.f. Military Operational Readiness Reviews - done by specialist, impartial experts.

  • Individual Capability Assessment - equivalent of on-going Pilot etc recertification.

  • Research: Quantifying & standardising metrics and models for "Effectiveness".
    DCITA/DBCDE on macro-economic results.


The ACS describes Gershon's recommendations as "all aimed at addressing the efficiency of ICT":
  • governance,
  • capability,
  • ICT spending,
  • skills,
  • data centres,
  • sustainable ICT.
Note the issue of Reducing Faults/Failures/Errors/Mistakes doesn't make the list.
Nor does the idea of institutionalising the building and improving of the Profession of IT, and increasing the Capability/Performance of IT Professionals.

By DCITA/DBCDE's own reports, ICT contributes 75% of productivity improvements: ICT is still the single greatest point of leverage for organisations reducing costs and improving output.

Does getting IT right in Federal Government matter?
Absolutely.

Gershon delivers 'more of the same' and could conceivably achieve its targets of 5% & 10% cost improvement.

2008/07/15

Bad Science or Science Done Badly?

Is 'Science', as practiced by Academic Researchers, executed poorly?

More specifically:
Is the practice of Research, as undertaken by Academics, as effective as it could be?

This posits that an aspect of "Professional Research" is intentionally increasing your capability and effectiveness.

Computing/Information Technology is a Cognitive Amplifier - exactly suited to central parts of "Professional Research" - e.g. learning, recalling and searching published papers and books.

If an individual researcher can increase their "knowledge uptake" by just 7% a year, then after a decade they will know twice as much, since uptake compounds on existing knowledge.
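A quick back-of-the-envelope check of that compounding claim - a minimal Python sketch; the 7% rate and ten-year horizon come straight from the paragraph above, the rest is plain compound-interest arithmetic:

    # "Knowledge uptake" growing at 7% per year, compounding on a
    # normalised starting knowledge base of 1.0.
    rate, years = 0.07, 10
    knowledge = 1.0
    for year in range(1, years + 1):
        knowledge *= 1 + rate
        print(f"Year {year:2d}: {knowledge:.2f}x")
    # Year 10 prints 1.97x -- effectively double.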

What is Research about if not Knowledge: Gathering, Analysis, Representation, Taxonomy/Ontology, Management and Communication?
This field, broadly known as "Knowledge Management", emerged around 1995.

2008/05/28

I.T. Strategic Planning Failures

Sue Bushell asked on "LinkedIn": What are the most common failures in strategic IT planning and how are these best avoided? What best practices in strategic planning are most effective?

My answer:

1. There are no I.T. projects - only Business Projects.
Hence changing the premise of your question:
What are the most common business process failures around I.T. solutions?
[A: Make the business run the project and take the rap if it fails.]

2. I.T. is an Industry, not a Profession.
Proof: Professions learn. Within I.T., repeating Known and avoidable Errors/Mistakes is consequence-free; in a true Profession it isn't.

3. The complete lack of History in I.T. - both on macro and micro scales.
  • Show me any large organisation that can even list all its current projects, which is a necessary starting point for:

  • Formal "Lessons Learned" from projects and operations - known problems are avoided, known effective practices are used.

  • Jerry Weinberg wrote definitive works on Software Quality Management and, 35 years ago, demonstrated that focusing on Quality results in better code, written far faster & cheaper - and produced much more reliably and consistently.

  • Jim Johnson of the Standish Group began definitive research, nearly 15 years ago, on what proportion of IT Business Projects fail and the causes of failure. This work is fundamental to advancing the Profession - but nobody else studies this field, so his results can't be verified or refuted. Nor have organisations or practitioners, by-and-large, acted on this knowledge. People do argue that his results are suspect because other single-shot reports don't agree. But nothing happens to resolve this fundamental issue!

  • Software ReUse is notable in how little it is practiced. Can it be possible that nearly every problem is completely new? Not in my experience.

4. The fundamental reason IT is used: It's a "cognitive amplifier".
Computing amplifies the effort and output of people, providing results 'Cheaper, Better, Faster'.

On the micro scale, no organisation I've heard of measures this. It's quantitative and should be calculable by any half-reasonable Management Accountant.

On the macro scale, the 'Profession' doesn't have or publish benchmarks on results (i.e. from across many organisations).

5. The 'Profession' doesn't even have a taxonomy of jobs and tasks, let alone any consistent method for evaluating and reporting the competence of, and skill level of, practitioners.
  • In a construction project you wouldn't specify "10 vehicles needed"; you'd say "6 5-tonne trucks, 2 utes, a 20-tonne tip-truck and a bobcat".

  • If the profession can't distinguish between the speciality, competence and skill levels of its practitioners, how can the business folk?

  • If project plans don't identify the precise skills needed - implying some way to assess and rate the 'degree of difficulty' of individual tasks/components - then the right 'resources' can't be applied.

6. The almost complete disconnect between research results and practice. Enough said.

7. [Added] The capability of the Profession in general, and of young I.T. practitioners in particular, has declined greatly.
Proof: The increasing number of failed projects attempting to replace 'Legacy Systems'.

E.g. the failed A$200M Federal Government ADCNET project. I worked on the original IBM mainframe system, then found myself 15 years later sitting in the same awful basement, not 50 feet away, coding its replacement. The IBM system took 30-35 man-years (in structured assembler); the second phase of the ADCNET system alone had a team of 70 for 1-2 years - and was abandoned. The best description of it is the Federal Court Judgment:
GEC Marconi Systems Pty Limited v BHP Information Technology Pty Limited
Federal Court of Australia
12 February 2003 and 14 July 2003
[2003] FCA 50; [2003] FCA 688

8. [Added] Creating Software is a performance discipline.
You have to both know the theory and be able to create good software.
Who are the Great Heroes of Open Source? The guys who demonstrate they can code well.

Like Music, Surgery and Architecture, software requires head and hands to do it well.


9. [Added] Design is Everything.
This is what the Bell Labs Computing Research guys understood and what Microsoft doesn't. They invented the most cloned Operating System in the world - Unix - and then went on to build Plan 9, its replacement, 20 years later, with around 20 man-years of effort. It was created portable and scalable, running on 6 different platforms from day 1. Of course it was incredibly small and blindingly fast. Time has shown it was robust and secure as well.

It's no accident that 15 years later Microsoft spent around 25,000 man-years on 'Longhorn', and then threw it all away! (The infamous 'Longhorn Reset', announced by Jim Allchin in 2004.)
Then spent the same again to create 'Vista' afresh from the 'Windows Server 2003' codebase.

How could Microsoft not understand what was well known 15 years prior, especially as Microsoft itself had ported Unix (Xenix) to Intel in 1985?


There's more, but that will do for now.


"I.T. Governance" may be part of the Solution, but standards like AS8015 are primarily aimed at allocating blame or pushing all responsibility for failure onto I.T. and abnegating from I.T. any successes.

The 'root cause' of all I.T. failures is trivial to identify, but probably exceedingly hard to fix. These days, almost no projects should fail due to technology limitations - only practitioner and management failures.

The 'root cause' is: Business Management.

Yes, there are many problems with I.T. practitioners, but think about it...

Around 1950, Commercial Computing was born.
Some projects worked, in fact succeeded brilliantly: Man went to the moon on the back of that work just 2 decades later.

And then we have the majority of 'ordinary' projects that fail to deliver, are abandoned or under-deliver...

The first time 'management' commissioned a bunch of 'Bright Young Things' to build The Very Best Computer System Ever, they would naturally believe the nerds and their self-confidence.

After that effort failed, what would the rational approach be to the next project?

Not the usual, "do whatever you want and we'll see", but "you didn't do so well last time, how about we try smaller pieces or doing it differently?"

And when lining up for the third go-round, you'd think competent business managers (the ones writing the cheques) would put the brakes on and say "you haven't shown you can deliver results, we have to manage you closely for your own sakes."

"Fool me once, shame on you. Fool me twice, shame on me."

And who's the cause on the third, fifth, hundredth or thousandth repetition?
The people who keep paying for the same ol', same ol'.




2008/03/09

Videos on Flash Memory Cards - II

My friend Mark expanded on my idea of "HD DVD being irrelevant" - like phone SIMs, video stores can sell/rent videos on flash cards (like SD) sealed in a credit-card carrier.

The issues are more commercial than technical. 8GB USB flash memory might hit the A$50 price point this year - and A$30 next year. There is a 'base price' for flash memory - around $10-$15.

This inverts the current cost structure of an expensive reader/writer and cheap media - which is perfect for rental/leasing of media: a refundable 'media deposit' works. An added bonus for content owners is a significant "price barrier" for consumers wanting to make a copy. If a 'stack' of 100 SD cards costs $1500 (vs $100 for DVDs), very few people will throw these around 'like candy'.

Mark's comments:

Y'know, the more I think of it, the more the SD-embedded-in-a-credit-card idea has a lot of appeal once the availability and price point for 8GB SD cards are right. It makes it easy to print a picture, title and credits/notices etc on the 'credit card' - something big enough to be readable, a convenient display format and, as you say, nicely wallet-sized. Snap off the SD and you've agreed to the conditions etc, plus the media is now obviously 'used'.

It's a useful format for other distributions too - games, software, etc (Comes to mind that SAS media still comes on literally dozens of CDs in a cardboard box the size of a couple of shoe boxes).

My complete collection of "Buffy" would come in something the size of a can of SPAM or smaller, rather than something the size of a couple of house bricks for the DVD version, or something still the size of a regular paperback for the Blu-Ray version. For collectors of such things, the difference between many bookshelves taken up by the complete set vs a small box of credit-card (or smaller) sized objects is significant. The ability to legally re-burn, or replace and re-burn, the media when it fails is critical though.
SJ: Because of the per-copy encoding to a 'key', stealing expensive collections isn't useful, unless the key is also taken. So those 'keys' have to be something you don't leave in the Video player.

You've covered the DRM aspects and better alternatives to DRM - which also means that I can burn and sign the media I might produce and distribute myself without needing to involve the likes of Sony or Verisign - although that is possible also - which protects the little producer. Include content in Chrissy and Birthday cards - you've seen those Birthday cards with a CD of songs from your birth year - why not a sample of the movies from that year, plus newsreels etc. Good for things like audio books - whole collections. And if the content on an SD gets destroyed, as long as the media is OK, it would be possible to re-burn it. Most current DVD players now also have SD readers as standard.

Surely someone has thought of it already! Part of the attraction of DVD over storing your library on a 2TB USB disk from Dick Smith is the problem of backups. DVD is perceived, incorrectly, as permanent storage. I notice some external USB drives now have built-in RAID 1 or RAID 5, but Joe Public doesn't see the need (how come I bought a 2TB drive and I only get 1TB?).

Yeah, I think the proposition that SD or similar will become the ubiquitous preferred standard portable, point-of-sale, recording and backup storage media for photos, movies and music has some credence. There is something to be said for - "you pick it up in your hand; you buy it; it's yours" - over - "downloading and buying some limited 'right to use' ".

2008/03/07

Service Desk and Politician e-mail

Over the last year I've penned 6+ e-mails to various Labor Party politicians - including one of my local representatives, whom I've dealt with for ~10 years.

And not one reply. Zero, Zip, Nada...

Rang the Good Person's electoral office today - and got various run-around responses. "Oh, I've been on holiday", "Oh, can they call you" and "they are booked solid for a month".

Yeah, right.

I first contacted my rep. last December saying "this can wait until after the School Holidays". January came and went, no reply... A follow-up email yielded nothing... A note to the support staff finally drew a reply: "I've moved. XXX is responsible".

What I originally wanted to talk about was 3 emails I'd sent various members without even getting an acknowledgement. Which is strange, because I've seen media reports that Political Parties are now tracking every contact from a voter - putting together, apparently, impressive profiles, all completely legit under the Privacy Laws.

For a new Government this seems a pretty poor response, doubly so for one that prides itself on 'listening'.

The solution that I wanted to put forward to my Rep:

Use HelpDesk Software to manage constituent contacts.
Not just piecemeal, but an integrated system for all participating elected members.

Not all that hard.
It scales. It goes across the whole Party. It covers contacts via 'aph.gov.au' and via other email addresses. It copes with email, phone, fax, mail and personal contacts - and, worst of all, "voice prompt systems".

The software is well known, there are many vendors and trained consultants, and the marketplace is competitive. As consumers and office workers, most of us are used to the concepts and how these systems work.

It creates a definite process - with self-imposed rules & priorities that are checked and enforced.

AND it ensures that little people like me don't just fall between the cracks.
Or if some 'critical person' falls down, their work queues can be given to those best able to deal with them.

Imagine getting a tracking number back from your local Pollie, and being able to automatically check where it is up to - and just when you should expect an answer. Wow! Just like they worked for us and were trying to use the technology responsibly...
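As a sketch of how little machinery that tracking number needs - a toy Python ticket register (the class, numbering format and queue names are mine, purely illustrative; a real office would buy off-the-shelf HelpDesk software, as argued above):

    import itertools
    from dataclasses import dataclass, field

    _seq = itertools.count(1)

    @dataclass
    class Ticket:
        number: str            # tracking number handed back to the constituent
        constituent: str
        channel: str           # email, phone, fax, mail, in person
        subject: str
        queue: str = "triage"  # work queue; reassignable if a staffer is away
        status: str = "open"
        history: list = field(default_factory=list)

    def lodge(constituent: str, channel: str, subject: str) -> Ticket:
        # Record the contact and mint a tracking number on the spot.
        t = Ticket(number=f"MP-{next(_seq):05d}", constituent=constituent,
                   channel=channel, subject=subject)
        t.history.append("lodged")
        return t

    # The constituent immediately gets a reference they can quote and chase:
    t = lodge("S. Jenkin", "email", "HelpDesk software for constituent contacts")
    print(t.number, "-", t.status, "in queue:", t.queue)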

It would do a service for our erstwhile representatives - you know, the ones we pay to work for us:
  • They could become more efficient - by delegating work, not needing to deal with "whatever happened to" requests, and identifying common themes and selecting the most efficient way to respond.
  • They could make a very exact case for additional clerical support from the Parliament - or even have a pool of paid staff doing the grunt work.
So I'm not holding my breath while waiting for anything different to happen.

The Internet Changes Everything - except Politicians and their ways.

2008/03/06

Who cares about HD DVD?

Talking to a friend at lunch today, the topic of "Blu-Ray" vs "HD DVD" formats came up...

I think "Blu-Ray" may take the market, but it won't be much of a market.
There are just too many competitors for moving around video files:
  • DVD format disks - still good for 8.5GB (dual layer). Drives & media are cheap.
  • flash memory - 2008 sees A$50 for 8GB on USB (less on SD card)
  • A$300 for 750GB-1TB USB hard drives - under $1 per DVD's worth of data.
  • Internet download, with ADSL2+ giving 5-10Mbps to many (see the timing sketch after this list).
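Whether download really competes with physical media is simple arithmetic - a quick sketch, assuming a full 8GB title and a sustained line rate:

    # Time to pull an 8GB movie over ADSL2+ at 5 and 10 Mbps.
    size_bits = 8 * 8 * 10**9              # 8GB expressed in bits
    for mbps in (5, 10):
        hours = size_bits / (mbps * 10**6) / 3600
        print(f"{mbps} Mbps: {hours:.1f} hours")
    # ~3.6 hours at 5 Mbps, ~1.8 hours at 10 Mbps -- overnight, easily.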
My thesis to my friend was "Video stores may well go for SD cards". Pay a refundable deposit for the flash card, and a fee (rental or ownership) for the content. Video stores can pre-burn large numbers of movies - and if you want a 'special' - they can make one for you in 20 minutes.

His response: "they could package them like SIMs - in a snap-off credit card-sized holder". Which is better than any idea I've had on packaging.
And it fulfils the most important criterion:
it fits comfortably in a pocket (now a wallet).


Practical problems:
  • How to stop people copying the flash and resealing it?
  • Some sort of effective copy-protection system would be good.
  • Flagging 'ownership' or usage conditions of a movie - not so much DRM as 'this is the property of XXX'
These problems can be nicely solved by users having their own "Key Card" with a digital identity and an encryption key.

The flash needs a 'fuse' that is broken when the card is snapped out of its carrier - preferably an on-chip use counter that can only be factory-reset.

To issue a movie to a customer, the encoding key of the video (if present) would be combined with the user's key - and the resulting unique key written on the card. Players need both the card and the user's key to decode and play the movie.

That same process also tags the card with the current owner.
You lose it, it can come home to you.

Because the content can be locked to a particular ID, the raw content can be stored on disk without the movie studios giving away their birthright.
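A minimal Python sketch of the key-combination step described above. The post specifies only that the video's encoding key and the user's key are combined into a unique per-copy key; using HMAC-SHA256 as the combining function is my assumption - any keyed one-way mix would serve:

    import hashlib, hmac, os

    def combine_keys(user_key: bytes, video_key: bytes) -> bytes:
        # Mix the title's encoding key with the customer's key into the
        # unique per-copy key (HMAC-SHA256 is an assumed choice).
        return hmac.new(user_key, video_key, hashlib.sha256).digest()

    video_key = os.urandom(32)  # per-title encoding key, held by the store
    user_key = os.urandom(32)   # lives on the customer's "Key Card"

    card_key = combine_keys(user_key, video_key)  # written onto the SD card

    # A player needs BOTH the card and the Key Card: without user_key it
    # cannot reproduce card_key, so a stolen card is useless on its own -
    # and the derivation implicitly tags the copy to its owner.
    assert combine_keys(user_key, video_key) == card_key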

Summary:
I think 120mm disks are going to follow the floppy disk into the technology graveyard.
They will have certain uses - like posting something on cheap, robust media.

With the convergence of PC displays and Home Theater, the whole "Hi-Def TV" problem is morphing. Blu-Ray - can't wait to not buy one.

2008/02/08

The Open Source Business Model

This post by Dana Blankenhorn on ZDNet is the best answer I've seen to the question "Why Open Source?".

He says 'plumbing', I'd say '(Unix|Open Source) is the Universal Glue'.
And the on-going Open Source Business Model is "support" for those that need/want 'certainty'.

Which, if you are the CIO (read: 'my arse is on the line') somewhere with a high dependence on I.T., is only Good Governance (or "common sense"). You can't make key staff stay, nor mandate that they never get sick or burn out and "go sit on a beach" - and after '9/11', all Business Continuity plans have to account for covering people as well as systems and networks.

That's it - Business I.T. is all about the Data (or "all about XXX, stupid" to be Clintonesque).
Open Source tools are usually about manipulating data or providing services - like Apache, e-mail, DNS, firewalls and IDS, ...

Open Source is here to stay: use it, don't deny or fight it.

This Business Model, 'support for essential tools', is robust and on-going.
Whatever systems you use in the Data Center, you'll always have the need to provide many services and interface disparate systems and data formats.

The model also applies to embedded 'Appliances' and dedicated devices, like firewalls - or commercial web-hosting services. They are based in whole or part on Open Source.

You'll note this model has very limited application to the client-side - the 'Desktop' or End-User compute platform.

"Free Software" from GNU et al is about an ideological stance and subsumes all other goals to this.

"Open Source" is pragmatic and about getting on with the job. It makes sense for large vendors, like IBM and HP, to support it. Customers can feel confident and secure - because the source and tool-chain are freely available from multiple sites, they cannot be held to ransom or 'orphaned' by unpredictable events or capricious decisions.

"Open Source" starts from the premise that "IT is done for a Business Benefit" - that you build software, systems and services for the use of others, not your own amusement and benefit.

Business-supporting software has to meet Professional standards/criteria - good design, clear documentation, reliability, robustness and very few errors/defects - with the unstated driver of Continuous Improvement.

Never new features for their own sake or to create 'forced upgrades'; always making the code more stable, usable and useful.
Commercial considerations, by definition, are always subsidiary to technical ones. If the user community doesn't like changes, they aren't forced to upgrade and, in extreme cases, can 'fork' the code, internally or publicly: just do it how they want.

2008/01/19

Human Response to Cognitive Workload

Context: This piece started as a question to a researcher in Psychology.

There's a topic I've been trying to find research results on for some time.
I call it "(Human) Cognitive Response to Workload".

There is a bunch of quantitative data available on "Physiological response to Workload" - e.g. US Navy studies of "stokers" working in different levels of heat.

I found Prof. Lisanne Bainbridge in the UK. She's retired now. Her field is 'mental load', but she couldn't point me at research in the area or help me properly phrase my question.
She pointed me at Penny Sanderson, now a Professor at the University of Queensland.

What I'm interested in is any information to apply to Software Developers and other Knowledge workers:
  • In the short, medium & longer term (day, week, year) how do you maximise cognitive output?
  • What roles do sleep, recreation & holidays play in 'recharging' cognitive abilities?
  • For different levels (degrees of difficulty) of cognitive task (or skilled manual task) what are the optimum work rates and duty cycles? (ratio of work/rest)
Related areas are Rates of Error and the effect of 'tiredness' on maximum cognitive performance.
[James T. Reason has very good work on "Human Error" and "Organisational Error". His work is used extensively in Aviation and Nuclear safety. He originated the "Swiss cheese" model of accidents.]

2008/01/01

Solving 'Spam'

It never ceases to amaze me, the Politicians' attitude to Porn and to 'Spam' & its friend, malware.

Porn is "bad, bad, bad" and Pollies show very high interest - including policy & legislation.

Lots of angst & thrashing around about eradicating something that 2,000+ years of writing/publishing shows can't be controlled or legislated away. The physical publishing world & (cable) TV show that the only effective means of control is to allow-but-license.

Same as tobacco. Never going to eradicate it, only control it.

Access to 'Restricted Content' can be controlled iff:
  • every page is 'classified' at source (meta-tags),
  • an unforgeable Internet 'proof-of-age' card/system is created,
  • there are criminal penalties for subverting the system, forging identities or misclassifying pages,
  • there are no legal jurisdictions outside 'the system' [e.g. on the high-seas],
  • all browsers enforce 'the rules',
  • and browsers can't be built/written to ignore 'the rules'.
i.e. It is impossible to eliminate 'restricted content', and possibly provably so...
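That 'iff' is a strict conjunction, which a trivial Python check makes plain (the condition names paraphrase the list above; the truth values are my reading of the state of play):

    # Control over 'restricted content' requires EVERY condition to hold.
    conditions = {
        "every page classified at source": True,
        "unforgeable proof-of-age system": False,      # doesn't exist
        "criminal penalties for subversion": True,
        "no jurisdictions outside the system": False,  # the high seas
        "all browsers enforce the rules": False,
        "non-compliant browsers impossible": False,
    }
    # One False anywhere and the scheme collapses:
    print("control possible:", all(conditions.values()))  # -> False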