Russel Winder's Website

UK Government, Patents and SMEs

It seems increasingly clear that, despite all the rhetoric about small- and medium-sized enterprises (SMEs) being the centre of innovation, and at the heart of the economy, the UK government handles the issue of patents in such a way as to keep big companies happy.

In case you weren't already aware, a patent is a tool for a government to give an inventor a monopoly on the exploitation of an invention, as long as the invention is documented and made public. Part of the "SME rhetoric" about patents is that they give "the little guy" an opportunity to have an invention exploited by others whilst obtaining revenue from that exploitation via licensing fees. Of course this is complete rubbish. Patents are a tool for big companies to use against their competitors - being in the patents game requires significant funds, and only big companies have enough to play the game.

I recently spotted an article on The Register which referred to a PDF file of an open letter by the SME Innovation Alliance to the UK government. This is an interesting double damnation of the government regarding patents and innovation. Their main argument on patents is that SMEs do not have the financial muscle to enforce their patents, i.e. big companies can simply exploit the patented invention and not take out a licence, and there is damn all the SME can do. Basically, big companies have enough money to ensure that an SME (or an individual inventor for that matter) runs out of funds before any case comes to court. They can therefore steal an invention with equanimity.

Their second point is that the government should stop funding university research and instead give the money to SMEs for product development. Whilst there are some serious faults with the way "innovation" is funded by the UK government, not funding the universities is not the answer. However, that is a different issue to the patents one.

Back to patents: So now even SMEs are highlighting the fact that patents are tools for big companies and that SMEs have little or no ability to work within the patents system. One is therefore drawn to one of two conclusions:

  • The UK government (and indeed the USA government, as all the arguments apply there equally, if not more so) is implicitly (well, explicitly now, given the way they have responded to the SMEIA) colluding with big companies to ensure no SME manages to ruin the big companies' ability to control the economy.
  • The patent system should be scrapped, as it explicitly grants monopolies and therefore violates the spirit of all the competition ("anti-trust") laws that are on the statute books.

None of this is explicitly about software patents, it is about patents in general. But if software became patentable in the UK as it is in the USA, it would be a clear indicator that the UK government is backing total control of the software industry by the big companies, excluding SMEs from any contribution. The UK government is proposing to update all the laws on intellectual property; will it favour SMEs or big companies?

Oracle, ASF, JCP, and Java

Ever since Oracle bought Sun and became custodian of Java, the Java community (well, the vocal element of it anyway) has been wondering what is going on. At each turn Oracle appears to have gone out of its way to annoy (euphemism!) the FOSS community.

  • The patent and copyright lawsuit Oracle has brought against Google is clearly an attack on an independent, unlicensed implementation of a virtual machine capable of running Java code. This has been taken as a full-frontal attack on Apache Harmony, a FOSS implementation of a Java platform.
  • Oracle convinced IBM to line up behind OpenJDK and the JCP, abandoning Apache Harmony.
  • Oracle has put all the independents on the JCP EC, Doug Lea et al., in the position of having to resign their seats or compromise their beliefs.
  • Oracle has done a U-turn in its attitude towards the Java TCK and the "field of use" issue that was a war between Sun and the Apache Software Foundation (ASF). Oracle originally sided with ASF, but has now switched to the Sun (aka Oracle America) side, (hypocritically) hardened that stance, and engineered a situation in which ASF had to resign its place on the JCP EC.

It is no wonder that the FOSS community feels that Oracle is undermining the Java ecosystem, especially after what they did to the OpenSolaris community: forcing the independent organization to quit and hand back all rights to Oracle. Oracle, though, is not entirely stupid, even for an organization as big as it is. So what is the plan? Well, actually it seems fairly obvious. They aim to make proprietary all the technologies they bought when they bought Sun. Solaris has already been made proprietary again, at least in effect if not in legal terms. The entire Java platform seems to be next on the list.

Even though Java is licensed under the GPL, all the patents and trademarks are held by Oracle. Anyone can fork the codebase but no-one can create an implementation of Java without Oracle's consent. Even though the codebase can be forked, that fork cannot be marketed in any way, shape, or form as Java. When Java was a Sun thing, Oracle were entirely happy to freeload on the work of all the volunteers and Sun staff. Now that they own Java, they want to ensure they protect and increase the income stream generated by the Java platform.

In this light, almost all of Oracle's actions can be seen as contributing positively to the overall goal. In the end all they care about is generating income and making a profit. They see a proprietary Java as an integral part of that. They cannot remove the GPL licence on the codebase, but they can control the way in which the codebase is managed and evolved. Removing all the independents from the JCP EC makes sense in this light. Oracle wants everyone involved in the decision making associated with Java to be fully paid-up licensees, so that relationships are based on financial transactions and income generation.

Oracle can, though, make mistakes: look, for instance, at the situation over Hudson. Oracle tried to make a "land grab" for the codebase and the community in the face of the core development team deciding to take the whole project away from Oracle territory. Oracle claimed that even if a fork was made, it could not be labelled Hudson as Oracle held all the trademarks. In fact they didn't; this was an outright bluff. But at the same time they sent their lawyers out around the world to register trademarks to try to cover the lack of validity of their claim.

Now, whilst the vocal elements of the FOSS community are up in arms, will the overall Java ecosystem care? Unlikely. Most organizations that use Java don't actually care who controls it as long as it works. If Oracle start charging more for Java, or any part of the overall ecosystem, then organizations will pay up as a cost of business and pass the costs on to their customers. Is this the end of Java? Highly unlikely. Will all the fuss die down during the lead-up to the release of Java 7? Almost certainly. Did Oracle win the recent battle with ASF? Not really. Will Oracle win the Java War? With IBM onside, yes they will. Is it sad the way they are doing it? Definitely.

Parallelism can be Groovy

I will be doing a session at 2011-01-13T18:00+00:00 for the BCS Advanced Programming Specialist Group entitled "Parallelism can be Groovy". It is not a coincidence that the title is exactly the same as the one for a "webinar" that the BCS Distributed and Scalable Computing Specialist Group tried to put on with me as presenter at 2010-03-17T17:30+00:00, but which didn't actually work (for various reasons). This new session doesn't use exactly the same material - the world has moved on so much in the last nine months! - but it is about the same ideas.


Traditionally, parallel programming has been seen as the preserve of C, C++ and Fortran, with MPI and OpenMP as the de facto standard tools. This mindset has come out of the HPC milieu, and the HPC community seem not to be looking for any other options.

As the Multicore Revolution has meant that every computer is now a parallel processing computer, all programming is becoming parallel programming. In the worlds of desktops and web services the main programming languages are Java and Python. These languages have incorporated a threads model since their inception. So successful have these been that even C++ is getting a standard threads model. MPI and OpenMP have been tried in the Java, Python and C# communities, but have basically failed to work. They are tools likely only ever to be used in HPC with C, C++ and Fortran.

The problem with using threads is that it introduces the age-old problems of locking and synchronization - issues that the average programmer finds hard, very hard, often impossibly hard. What is the root of the problem? Shared-memory multi-threading. What is the answer? Don't use shared-memory multi-threading! Obvious really.
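
To make the problem concrete, here is a minimal sketch in Java (my illustration, not material from the talk) of the classic lost update: two threads increment a shared counter with no synchronization, and increments are silently lost.

    // Two threads race on a shared counter. The read-modify-write of
    // counter++ is not atomic, so increments from one thread can
    // overwrite increments from the other.
    public class LostUpdate {
        private static int counter = 0;  // shared mutable state

        public static void main(String[] args) throws InterruptedException {
            Runnable work = () -> {
                for (int i = 0; i < 1_000_000; ++i) {
                    counter++;  // data race: no lock, no atomicity
                }
            };
            Thread a = new Thread(work);
            Thread b = new Thread(work);
            a.start(); b.start();
            a.join(); b.join();
            // Almost always prints a total well short of 2000000.
            System.out.println("Expected 2000000, got " + counter);
        }
    }

The orthodox fix is to lock around every access, and that is exactly where the hard part starts: deciding what to lock, and in what order, is what programmers get wrong.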

So what are the alternatives? Actually they have been known for a long while: the Actor Model, dataflow, and communicating sequential processes (CSP). There is also the more recent, but still well known, idea of software transactional memory.

In the Java milieu, Scala has opted to highlight the Actor Model as the principal tool of concurrency and parallelism. In the Python milieu there is a move to providing support for lightweight processes via the multiprocessing package.
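
To give a flavour of the message-passing alternative, here is a toy "actor" sketched in plain Java (an illustration of the idea, not Scala's actor library): a thread owns a private mailbox, all communication with it is by message, and so user code needs no locks.

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    // A toy actor: a thread that owns a mailbox and reacts to messages.
    // Sender and receiver share no mutable state, only messages.
    public class ToyActor {
        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<String> mailbox = new LinkedBlockingQueue<>();
            Thread actor = new Thread(() -> {
                try {
                    for (String m = mailbox.take(); !m.equals("stop"); m = mailbox.take()) {
                        System.out.println("received: " + m);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            actor.start();
            mailbox.put("hello");
            mailbox.put("world");
            mailbox.put("stop");  // poison pill shuts the actor down
            actor.join();
        }
    }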

There are layers of language here:

  1. Native, statically typed: C, C++, Fortran
  2. Virtual machine, statically typed: Java, Scala
  3. Virtual machine, dynamically typed: Python, Groovy, Clojure

What is becoming clear is that the dynamically typed languages have an increasingly strong role to play as "coordination languages". These languages have meta-object protocols, which make creating domain-specific languages very straightforward.

Taking Groovy as an example, the GPars project is creating a language for describing the structure and management of parallel computations on the JVM, both locally (utilizing the multi-threaded nature of the JVM) and cluster-like (utilizing the latest NIO-based distributed systems infrastructure). Initially the focus was on providing Actor Model infrastructure for Java/Groovy systems; attention is now turning to dataflow and especially to CSP.
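
The dataflow part of this can be sketched without any library at all: a dataflow variable is single-assignment, and any reader blocks until a value has been bound. A minimal Java rendering of the concept (my sketch, not the GPars API) might look like this:

    import java.util.concurrent.CountDownLatch;

    // A write-once dataflow variable: get() blocks until bind() is called.
    // An illustration of the concept only, not the GPars API.
    public class DataflowVariable<T> {
        private final CountDownLatch bound = new CountDownLatch(1);
        private volatile T value;

        public synchronized void bind(T v) {
            if (bound.getCount() == 0) throw new IllegalStateException("already bound");
            value = v;
            bound.countDown();
        }

        public T get() throws InterruptedException {
            bound.await();  // wait until a value has been bound
            return value;
        }

        public static void main(String[] args) throws InterruptedException {
            DataflowVariable<Integer> x = new DataflowVariable<>();
            new Thread(() -> x.bind(40 + 2)).start();
            System.out.println(x.get());  // prints 42, whenever the binder runs
        }
    }

Because every variable is bound exactly once, a program built this way computes the same result whatever the thread scheduling, which is precisely what makes dataflow attractive for coordination.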

The presentation will focus on demonstrating the role that dynamically typed languages have in the future of parallel computation.


ACCU London Meeting 2010-11-18

Last evening I did a presentation, "Java, Python, Ruby, Linux, Windows are all Doomed", for the ACCU London people at Skills Matter's place in Barbican. As always they videoed the session. I believe the video will appear here as soon as it has been processed. If you want to peruse the slides they are here.

Armistice Day

Today is Armistice Day. We should all actively remember what this means - it is about the cessation of hostilities, an end to killing and destruction. Sadly, far too many people in the world think fighting and war are a good thing.

Google should have done a much, much better job worldwide of celebrating what today is.

Go Go

Apparently yesterday, 2010-11-10, was the first anniversary of the existence of the Go programming language. Whether it is actually true or not doesn't matter: Rob Pike, the leader of the Go team, has declared in an email to the mailing list that this is a fact!

Go has definitely exhibited "phenomenon" status: in an incredibly short time it has evolved into something very useful and usable, and it has created a "buzz" not associated with an imperative programming language since 1995 and the rise of Java.

Go shows signs that it might quickly topple C as the "compiled to native code" programming language of choice. The fact that some of the original developers of C and UNIX are a part of the Go team gives added credence to the view that Go will replace C.

The core issue here is multicore. The way processor architectures are evolving, moving more and more toward distributed memory, leads to a problem for programming languages such as C, and indeed C++. Whilst C++ (via the C++0x standard) now has threads as part of the language, there is a built-in assumption that all the processors available to the program are fundamentally the same. With current multicore devices this tends to be true. The future though is heading much more towards heterogeneous multicore devices: chips with many processors, but with a multiplicity of architectures. Certainly libraries can help C and C++ deal with such systems, and indeed this is one of the reasons for OpenCL. However, OpenCL is really just an attempt to patch things up so that C and C++ can continue to be used; it isn't really a solution to the problem.

Increasingly, we are seeing moves to revive long-forgotten software architectures: Actor Model, CSP, Dataflow Model. This is a return to process-based rather than thread-based thinking. On the JVM, Scala is heading the Actor Model route. Erlang has been using the Actor Model for many years. In the native code arena, there are CSP implementations for C and C++. Go is a core part of this trend. Although developed independently of CSP, the process and channel model of Go, which evolved via the Newsqueak, Alef and Limbo programming languages that preceded it, is very similar to that of CSP. The somewhat twee naming of "goroutines" doesn't detract from the fact that they are a very usable tool for creating process-based, parallel systems.
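
The processes-and-channels style is not tied to Go. Here is a minimal sketch of the same shape in Java (my illustration of the CSP idea, not Go code), using a SynchronousQueue as a stand-in for an unbuffered channel so that sender and receiver rendezvous on each communication:

    import java.util.concurrent.SynchronousQueue;

    // Two sequential processes communicating over an unbuffered channel.
    // A SynchronousQueue forces a rendezvous on each put/take, much as
    // CSP channels (and unbuffered Go channels) do.
    public class Squares {
        public static void main(String[] args) throws InterruptedException {
            SynchronousQueue<Integer> channel = new SynchronousQueue<>();
            Thread producer = new Thread(() -> {
                try {
                    for (int i = 0; i < 5; ++i) {
                        channel.put(i * i);  // blocks until the consumer takes
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            producer.start();
            for (int i = 0; i < 5; ++i) {
                System.out.println("got " + channel.take());
            }
            producer.join();
        }
    }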

Go already seems to have what C would need in order to cope with the hardware devices that will soon be with us.

Big Bang Mark II

Commuters in London never have to buy a newspaper for their morning and evening journeys. In the mornings there is the free Metro and in the evenings the free Evening Standard. Generally both have some news, some sport reporting, masses of gossip, and lots of adverts. Ignoring all the advertising and the gossip stuff - I suppose someone must find it interesting and readable, or it wouldn't make up more and more of the non-advert content - the news and non-football sport reporting fill about 10 minutes of reading time.

Yesterday though I was shocked, but pleasantly so. "Big Bang Mark II" was the front-page headline, there was a superb picture from CERN showing the debris from an extremely high energy collision at the Large Hadron Collider (LHC), and the short one-column article was not only not entirely wrong, it was also not too over the top, nor overly condescending. Crikey, sane reporting by a newspaper about the LHC.

I don't know what caused the sudden re-emergence of interest in LHC activity, but well done, Metro, for doing something interesting about it.

Java, Python, Ruby, Linux, Windows are all doomed.

The ACCU London folk asked me to do a session for them on the evening of 2010-11-18 - I am not sure of the start time, I guess 18:30 or some such. It will be at Skills Matter's place just north of Barbican. The title is the same as the title of this blog entry, and the summary sent out for the session is:


The Multicore Revolution gathers pace. Moore's Law remains in force - chips are getting more and more transistors on a quarterly basis. Intel are now out and about touting the "many-core chip". The 80-core chip continues its role as a research tool. The 48-core chip is now actively driving production engineering. Heterogeneity, not homogeneity, is the new "in" architecture.

Where Intel research leads, AMD and others cannot be far behind.

The virtual machine based architectures of the 1990s - Python, Ruby and Java - currently cannot cope with the new hardware architectures. Indeed, Linux and Windows cannot cope with the new hardware architectures either.

So either we will have lots of hardware which the software cannot cope with, or . . .

. . . well you'll just have to come to the session.


Patently Ludicrous

A lot of focus in the software community over the last few years, and (sadly) for a few more to come, is on software patents - and quite right too: they are an important tool for big corporates to stifle innovation and competition. One case that is causing some angst recently is variously known as "Paul Allen vs. The World", "Paul Allen vs. The Internet", and "Paul Allen vs. All". In fact it is a suit against 11 companies: AOL, Apple, eBay, Facebook, Google, Netflix, Office Depot, OfficeMax, Staples, Yahoo, and YouTube. A seriously motley crew. This case has the potential to bring the whole patent troll approach to business in the software arena to an end. Why? Well, the lawyers (and I am not one) seem to be indicating that there are serious faults in bringing a single case against so many independent and disconnected defendants. Also the claims in the suit are extremely non-specific, as are the patents themselves. From the press release issued by Interval Licensing LLC itself:

The patents Interval is asserting include:

  1. United States Patent No. 6,263,507 issued for an invention entitled "Browser for Use in Navigating a Body of Information, With Particular Application to Browsing Information Represented By Audiovisual Data."
  2. United States Patent No. 6,034,652 issued for an invention entitled "Attention Manager for Occupying the Peripheral Attention of a Person in the Vicinity of a Display Device."
  3. United States Patent No. 6,788,314 issued for an invention entitled "Attention Manager for Occupying the Peripheral Attention of a Person in the Vicinity of a Display Device."
  4. United States Patent No. 6,757,682 issued for an invention entitled "Alerting Users to Items of Current Interest."

The perception given by this list is definitely that they are claiming patents on Web browsers, Web browsing, advertising on screen, and pop-ups. Obviously, there must be a bit more detail. Fortunately, the patent documentation is stored by Google Patents.

From a cursory examination, 6,263,507 (filed 1999-06-25, granted 2005-04-12) appears to be about screen design for news reading "mash-ups". There seems to be no mention of task modelling, user modelling or any of the HCI approaches that have been standard for 30+ years, but then this is a US patent and so nothing to do with usability; it's about profit.

Bizarrely, 6,034,652 (filed 1996-03-22, granted 2000-03-07) and 6,788,314 (filed 2000-03-20, granted 2004-09-07) seem actually to be the same. The abstracts are suspiciously the same - except that "people" have been replaced by "users". Even in my role as "not a lawyer", it strikes me that the USPTO granted the latter without even looking at the former, and were therefore acting improperly - look at the filing and granting dates. Nonetheless these vacuous patents are on the books. They are again seemingly about newsreader "mash-ups", but the content of the text describes something obvious and not really an invention.

I bet you already guessed that 6,757,682 (filed 2000-09-07, granted 2004-06-29) is about newsreader "mash-ups". This time they explicitly introduce the existence of "The Cloud" in their diagrams. Interestingly, one of their diagrams looks very like the one in 7,028,023 (filed 2002-09-26, granted 2006-04-11), Linked List. I wonder if they will be paying royalties or are cross-licensing? 6,757,682 actually has some maths in it, so they have some model for processing, but isn't mathematics explicitly not patentable?

It is interesting to note that there is a suspicious linearity to the filing and granting of these patents. It would be interesting to know who the "inventors" were working for when creating these documents - which, to be honest, bear all the hallmarks of documents designed for patent trolling. The approaches are obvious, but couched in terms against which the USPTO can issue a patent.

Clearly the lawyers see a gravy train and a gold mine here and are milking it. Where does that leave UI designers? Well, if these sorts of patents are upheld as legal, then they are well screwed.

On Hardware and Software Procurement

Yesterday I read the article by Gijs Hillenius, gvt-must-stop-breaking-procurement-rules-and-move-to-open-source, and it got me thinking about the situation in the UK. Effectively the question is: is it appropriate for desktop and laptop products to be procured as hardware/software combinations, or should the hardware and software be procured separately?

In the early days of computers (*), computers came with operating systems provided by the hardware manufacturer. End users purchased a bundle of hardware and software from a single manufacturer, with different manufacturers offering different products in terms of operating system as well as hardware. Procurement consisted of requiring the manufacturers to tender their bundle, and of the purchaser making a choice of the best bundle for them. Software as well as hardware was a variable in this process.

Then we entered the Unix era: hardware and software, especially the operating system, started to have much more separate existences. End users could choose from the old hardware/software bundles available, or they could choose to run Unix and buy hardware to run it on. This widened end user choice and so added more parameters to the procurement process. Software and hardware were still factors in procurement. Of course, manufacturers didn't like the commoditization of hardware that the existence of a portable operating system provided, so they splintered Unix into many versions, hastening a return to hardware-specific operating systems and to the purchase of hardware and associated software as a bundle. Despite this, end users still had a choice: software as well as hardware remained a factor in procurement.

With the rise of the personal computer a new opportunity for separating hardware and software arose. The operating system in this case was Windows. Windows rose as an operating system that was portable across many personal computers; it commoditized hardware. However, procurement did not gain any new parameters. Quite the opposite. Due to the amazing contracts signed by Gates on behalf of Microsoft, all personal computer manufacturers only offered Windows as the operating system. Worse, hardware manufacturers seem to have capitulated to the commoditization of hardware rather than fighting it as they did with Unix, and just toed the line and supplied Windows bundled. Procurement was of a hardware/software bundle, but it was always the same software no matter what the hardware. End user choice had been eradicated.

What is arguably wrong here is that after 20 years, procurers are no longer aware that software used to be a variable. They have been fooled by personal computer hardware/software bundling into the mindset that hardware is the only variable, that getting Windows is a fixed thing. There has thus been no challenge to the procurement process being referred to in Hillenius' article.

The real issue for procurers now is longevity of workflows and access to data. Hardware and operating system are essentially irrelevant. It is all about people being able to read and amend documents over periods of years - where I use document to mean any file of data, not just letters, papers, blurb, etc. One solution is to continue with the model of "Windows as the only platform to consider". Another solution is to ensure documents are stored in a format for which there is an international standard. Standard formats admit multiple suppliers and hence a market. Competition is supposed to be a cornerstone of business in the EU and USA (but perhaps not across the whole world). Currently there are multiple different applications for manipulating documents, so we do have some competition. However, the procurement processes appear not to have accepted that there are multiple suppliers of products for creating and amending documents.

So despite hardware and software having been separated by the existence of a portable operating system, procurement people continue to purchase hardware combined with Windows as software bundles. Now that Linux has created a viable alternative to Windows as a portable operating system on workstations, there is choice. However, this choice is not present in most procurements that are undertaken. Why aren't tenderers allowed to offer hardware/Linux as an option alongside hardware/Windows? Corporate culture seems to have bought in totally to Microsoft's desire: Windows is the only operating system.

Hillenius' article is challenging this status quo by asking: is the status quo illegal given the rules of procurement? Portugal and the UK are different jurisdictions, so in principle the answer could be different for the two countries. On the other hand, both are members of the EU and so there should be significant commonality. I am not a lawyer so cannot comment on the legal position, but as a voter I think it ought to be illegal for all government procurements to be founded on the assumption that Windows will be the operating system. Tenderers should be allowed to offer any combination that allows the workflows to be fundamentally unchanged and that provides continuity of access to, and amendment of, documents. Hardware/Linux and hardware/Windows should be seen as competitors, and that competition should be part of the procurement.

No matter how much hype and FUD Microsoft and others put out, the total cost of ownership of Linux is significantly lower than that of Windows. Whilst Linux is free of charge, that doesn't mean there is zero cost of ownership. There are issues of deployment, maintenance, etc., all of which involve people, and therein lies cost. Windows, though, requires more maintenance and has a significant purchase price, and so has a higher cost of ownership. Moreover, malware on Windows means installing defensive software that invariably does not come bundled with the operating system but is separate, thereby increasing the cost of ownership hugely. In these days of financial austerity, wouldn't a FOSS software solution help the country's finances? So even ignoring the legal situation, surely the financial situation means that hardware/Linux should be admitted to procurement processes wherever feasible.

(*) Of course there were even earlier days when computers didn't have operating systems; operators loaded user programs directly onto the hardware, and they ran to completion to be unloaded by the operators. What is worse, those computers were significantly less powerful than today's average smartphone.

Copyright © 2017 Russel Winder