Russel Winder's Website

Is ONEIS a vision of where information management is headed?

_ Someone pointed out I had said that Jack Kreindler was at 56 Harley Street, whereas it is actually 76 Harley Street, a very different place. Correction now in place. _

Yesterday I attended the ONEIS Open Day. ONEIS is a new way of managing information created through a collaboration between Ben Summers and Jennifer Smith. Ben is a programmer and Jennifer an information management expert. Between them they have created a system which may be the harbinger of doom for the relational database as the way most people manage their information.

Saying that the demise of the relational database is nigh may seem a bit over the top. But we have the NOSQL people (MongoDB, CouchDB, Cassandra, etc.) bringing new ways of managing persistence, and now there is ONEIS, which is a data store but is neither SQL nor NOSQL. ONEIS treads another path, that of the object store. The critical thing ONEIS brings to the persistent data store game is that it does not enforce a schema that remains static throughout the life of the repository. Instead, data and meta-data are held as objects with links. Simple, obvious, and very effective. Why is it so effective in the eyes of the current user base? Because of the attention to the detail of the user interface and user experience.

With a domain expert as half the development team, it is nigh on impossible not to have a system that works for the domain expert. In this case the domain expert has an excellent grasp of what works and doesn't work for a wide range of people using the system.

The presentations by the ONEIS folk were interspersed with "stories from the trenches" of people already using the system. Most of these were interesting, and certainly supported the marketing of ONEIS as the next big thing for users, but one in particular stood out as a showcase for what is wrong in traditional information management systems and what is exciting about ONEIS.

Jack Kreindler, Medical Director at 76 Harley Street, a private medical practice, definitely not part of the NHS, talked about how they are using ONEIS to create a complete, useful and usable electronic medical record system. The splendid thing about their system is that it works for the patient, the medics and the practice. This is something the NHS electronic patient record system can only dream of becoming. Yet Kreindler's system is exactly what the NHS system should aspire to be.

What is so different? Well, two things. First, Kreindler's system has evolved from a small system that worked into a larger system that works, albeit with a very different schema to the original. Second, it has been evolved by the users as they use the system, i.e. it has changed in response to perceived user needs. Contrast this with the NHS system. First, it is a government procurement, which implies massive upfront specification and huge documents, almost none of which are actually useful -- even though very large amounts of money are given by the government to the contractor. Second, there is an attempt to create a huge working system by direct fiat, and as John Gall has pointed out in his work on Systemantics, big upfront design and direct creation of big systems just never work. The government is seemingly unable to learn this simple lesson. Time and time again, huge projects are let and fail. Or is there a conspiracy theory here: that the contractors have a deal with the government to extract vast amounts of our tax money in exchange for doing nothing useful?

The moral is of course that Kreindler's system looks like an excellent prototype for the NHS to pick up on and evolve, stage by stage, into a national electronic patient record system. Sadly this is highly unlikely to happen as government procurement is so entombed in waste and failure, and isn't going to change.

But let's go out on a high note. The beauty of ONEIS is that it allows the schema of the repository to change over time according to the needs and desires of the users. Relational databases just do not have a chance in the face of the user-driven evolvability that the ONEIS way of doing things provides.

Big Money vs Innovation

I can't remember how exactly I got there, but I alighted on this webpage: Enough is Enough, by Fred Wilson. The article is a reaction to the ongoing cases brought by Lodsys against Apple and Google and, more terrifyingly, against individual applications developers. Lodsys appears to be a patent holding company that does nothing but license patents, i.e. it is the archetypal patent troll.

Wilson opens his article with:

_ I believe that software patents should not exist. They are a tax on innovation. And software is closer to media than it is to hardware. Patenting software is like patenting music. _

So here we have a venture capitalist (VC), i.e. a person who buys innovation for a living, saying that software patents should not exist. He is not the only one. Most VCs I know think patents are a waste of time for startups unless you want to get into the patent trolling business. If you actually want to create, market and sell innovative product, you are far better off using copyright and commercial secrecy as tools to protect IPR.

Big Money will of course object to this since patents have become an integral part of their business model. Fine if you are Big Money, useless if you are a small startup generating innovation. And everyone agrees startups are where innovation happens, not in big organizations. So a patent-based business model is pro Big Money and anti startup, and hence anti-innovation.

Sadly I fear the patent system is so deeply enshrined in Big Money's view of business that it will not allow the system to be changed, no matter how much the VCs complain. To those that have I guess.

In need of consulting gigs / teleworking gigs

For reasons which are unlikely to become apparent any time soon, I find myself in need of being much less laid back about income generation -- even though I am currently involved with a pre-money startup and writing a book. I have ongoing training gigs, but I need to increase income from that level. For various reasons I need to telework rather than commute.

I have a CV (of sorts) on my personal website. The short-form summary: skilled and interested in software development generally, in Python, Groovy, Java, Scala, D, Go, and Ruby; parallel and concurrent programming in Python, Groovy/Java/GPars, Scala/Akka, Go, D, and C++; the SCons, Gradle, Maven and Ant build frameworks; the Bazaar, Mercurial, and Git distributed version control systems; and LaTeX typesetting.

My accountant tells me it is best for me to work as a sole trader, which I do under my own name or, where the people letting the contract cannot work with named individuals, under the trading name "Winder Information Systems Engineering" -- which is my sole trader trading name registered with HMRC.

If the people letting the contract need to deal with a company, or there are liability issues which preclude working as a sole trader, then I trade through It'z Interactive Ltd -- though formally this is dormant just now as I have not undertaken any of this sort of activity for 5 years or so.

So if you have any consulting or training needs which I can fulfil, please contact me. If you have funds for projects, but no internal resource, where a project is amenable to being undertaken on a contract basis via teleworking, I would love to hear from you.

Software Patents and the USPTO

The United States Patent and Trademark Office (USPTO) has been active again in trying to amend its processes. This time it is all about "re-examination", and how to make it more streamlined. See, for example, this conference - thanks to Groklaw's "Groklaw Latest News Picks" of 2011-05-29T05:55+01:00. Also, of course, this Groklaw article about recent events in the case colloquially known as "Paul Allen vs The World". I am sure there are many other examples.

In a sense it is great news that the USPTO is looking at the processes involving patents in the US. Anything that appropriately streamlines useful processes is to be applauded. You are, though, waiting for the "but . . ."

The point that should be obvious is that this is all about re-examination. They are looking at the processes that involve the courts when litigation brings doubts as to the validity of a patent. Of course it is good to make these processes simpler and easier, but why is this the focus of attention? Surely the focus should be on why there are so many patents out there with so much doubt and uncertainty surrounding them: why are so many (software) patents approved that then come under challenge in the courts? The obvious conclusion is that the initial examination is inadequate, that (software) patents are being granted without proper "due diligence". One has to doubt whether the USPTO undertakes any technical examination at all, and suspect that it considers only the syntactic correctness of the documentation and the following of mandated processes in awarding a patent.

So the USPTO should look to its examination processes as well as its re-examination processes. There should be assessment of potential prior art. There should be assessment of the validity of the claims. There should be assessment of the obviousness of the invention. Even cursory examination of the patents involved in all the high-profile software patent cases currently before the courts in the US leads to the conclusion that very little notice was taken of the technical content of the application in awarding the patent.

I believe that software and processes should not be patentable since they are not products with a physicality; they are just ways of expressing ideas. However, even if such things were patentable, the USPTO does not seem to have employed any form of proper quality control over the patents it has issued. One can only infer that it "approves and lets the courts decide". Is this really a sane way of implementing monopolies approved by the state? (*) Well, if you are a patent lawyer, of course it is: it is a "gravy train", a licence to print money. If you are an inventor or entrepreneur it is definitely not; it is a massive threat, with potentially huge legal costs. Big businesses may be able to set aside monies as contingency for this, but the lone inventor, and the small and medium enterprises (SMEs) that are supposedly the darlings of the western economies, certainly cannot. So the process as it is favours big businesses; it is by and for them.

I think the obvious conclusion is that USPTO should undertake a root and branch re-examination of all of their processes, especially the ones at the beginning of the cycle. The US government clearly needs to step in and ensure that this quango gets put onto a proper technical footing. Unless of course it is culpable in preserving the status quo.

(*) Doesn't it strike you as ironic that states have so much legal infrastructure to stop monopolies from existing - anti-trust laws, competition laws, etc. - and at the same time have a legal infrastructure (patents) for enforcing monopolies?

JSR166 Released into the Maven Repository

_ The original post was wrong in some fundamentally fundamental ways, so much so that I have taken the liberty of editing the content to be correct. _

In JSR166 and the Maven repository I reported that JSR166 snapshot artefacts were newly present in the Codehaus snapshots repository. In breaking news, I can report that Doug Lea (helped by Tim Peierls) has tagged the master JSR166 repository with the tag release-1.7.0 so that a non-snapshot release can be made, and artefacts appear in the main Maven repository. I have employed my Git clone of the master JSR166 repository, with its gradleBuild branch, to build artefacts and upload them. They are in the Codehaus repository now and will appear in the main Maven repository at the next synchronization.

Details of the five artefacts:

groupId: org.codehaus.jsr166-mirror
artefactId: one of jsr166, jsr166tck, jsr166x, jsr166y, or extra166y
version: 1.7.0

Each of the artefacts has source code and javadoc files in the repository using the now-standard mechanism of the classifiers "sources" and "javadoc" - the binary artefact having no classifier.

The jsr166, jsr166tck, and jsr166x artefacts are there for completeness; the ones most people will be interested in are the jsr166y and extra166y ones: jsr166y comprises all the new classes that will be standard in JDK 7, whilst extra166y comprises the classes that will appear in JDK 8.
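As an illustration, depending on the jsr166y artefact from a Gradle build might look something like this (a sketch only; Maven coordinates are analogous):

```groovy
// Sketch of a Gradle dependency on the jsr166y artefact.
dependencies {
    compile 'org.codehaus.jsr166-mirror:jsr166y:1.7.0'
    // The sources and javadoc jars are available via the
    // 'sources' and 'javadoc' classifiers respectively.
}
```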

For further details or any questions about the artefacts, contact me directly. For information about JSR166 see the project home page.

Progress don't Regress: Another Gant Release

There was a regression in the 1.9.4 release of Gant that could not be ignored. The problem has been corrected, and a new release candidate checked by the people who found the regression (thanks due to Jeff Brown and the Grails folk). There is therefore a shiny new Gant 1.9.5 release. Everything should be in place at Codehaus already and Maven will get updated as soon as the Codehaus -> Maven sync happens.

The regression, whilst essentially trivial, was sufficiently fundamental that I have taken the drastic step of removing the 1.9.4 release from everywhere. This should have no effect on people using Groovy 1.7 or 1.8: they should use Gant 1.9.5. People still using Groovy 1.6 will not see a Gant 1.9.5 and will have to stay with Gant 1.9.3. Of course people still using Groovy 1.6 should upgrade to 1.8.0 immediately and therefore not have a problem with Gant :-)

Thanks, and sorry for the hassle.

Ubuntu 11.04 Natty Narwhal, Unity, and Debian Testing

New month, new Ubuntu release pending installation, time for update chaos. Only two machines to do these days, but (thankfully) I have approx (a deb file cache server) installed on my server and the machines use it as the packages source, so (thankfully) only one download of the 2GB of packages.
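For anyone curious, the setup is roughly this (hostnames illustrative; 9999 is approx's default port):

```
# /etc/approx/approx.conf on the server: map a path prefix to an upstream archive.
ubuntu    http://archive.ubuntu.com/ubuntu

# /etc/apt/sources.list on each client: point apt at the cache instead.
deb http://server.local:9999/ubuntu natty main restricted universe multiverse
```

The first client to ask for a package pulls it through the cache; every subsequent client gets the cached copy.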

Of course this upgrade is special: Ubuntu now forces you to use Unity by default. Unity is supposed to be a new way of working. My first thought on logging in after the upgrade was "good grief, this makes Mac OS X look good". On further tinkering, it seems that Unity actually is a (barely disguised) attempt to be Mac OS X on non-Apple hardware. Or perhaps it's an attempt to bring back CDE (if you remember CDE you'll get the joke, if you don't, sorry but it is funny).

Whilst I think Apple hardware is very good, I really don't like the Mac OS X user interface. Some people like it, which is fair enough; some just rave about it for reasons known only to themselves. I do not like it at all. Of course I dislike Windwoes even more, which is why I am a Linux person.

What I want of a just-logged-in user interface is a completely empty desktop with a very small transparent strip along the top with some information displays, a (very) few shortcuts, and a small, dense menu to access things I start rarely. I can get this straightforwardly with Gnome (I guess I could with KDE as well, I just haven't tried).

After five minutes with Unity, I knew I wasn't going to be a user of it. I couldn't find a way of making the icons smaller, or putting them on the right instead of the left, or stopping them disappearing on a whim, or making the fonts smaller, or changing the colours to something at least partly pleasing, or making the top bar transparent instead of a solid colour, or having the menu system small and dense instead of huge windows of huge icons, or stopping browsers opening full screen. The list goes on.

Aha, I hear you saying "but after only five minutes you haven't given it a chance". In a sense this is true, I haven't used Unity long enough to know it properly, but I have used Mac OS X for a while, so I know the basic user interface model being used - and I know I don't get on with it. Mac OS X is a fundamentally fascist system: Apple presents you with a user interface that it knows is right for you, even though they don't know each user individually. They do though give some parameters you can tweak, so there is at least some room for arranging for personal taste. Unity presents the same fascist philosophy, but without any obvious way of changing anything. So it seems to be just a sub-standard version of Mac OS X.

_ Mark, if you want to use Mac OS X buy an Apple computer, stop trying to coerce Ubuntu into being a Mac OS X look-alike. _

My summary is that Unity is for people who value glitz and triviality over the ability to actually do stuff other than display photographs, listen to music, watch videos, browse the Web, and perhaps use a word processor. I guess this is actually a large market - except that most of them now use smartphones or tablets for all their computing needs. My workflow and needs appear not to match those of Unity. OK, fine, I'll just use Gnome as I always have done. On logging out and logging back in with "Ubuntu Classic" I find that all my Gnome settings have been annihilated and the user interface is not obeying the Gnome theme that all the tools say is in force. What the f###?

Given that I was actually in the middle of angst about whether to continue with Ubuntu or go back to using Debian Testing - decision made. I am really a Debian Testing user now, not an Ubuntu user. I may leave Ubuntu there as a second OS on the two laptops, or I may reclaim the disc space. Anyway, on rebooting from Ubuntu to Debian, all the Gnome settings were just fine. Phew. But . . .

. . . there is soon to be Gnome 3 and the Gnome Shell. If Gnome Shell is anything at all like Unity, then it will not be appearing on my machines. But that is for the future . . .

(*) And all the raving about how great Unity is by the Ubuntu fanbois and shills really is rather ugly, and counter-productive to the marketing of Ubuntu to the masses (who still generally use Windwoes because it comes free with the hardware - but that is a whole separate rant).

A New Gant Release

Groovy has just released version 1.8.0. This has precipitated a new release of Gant, version 1.9.4. Artefacts are in the Codehaus repository and so should appear in the Maven repository very soon now - this has probably happened by the time you are reading this, well I hope so anyway. Distributions are available from the Gant website.

Can C++ be resurgent?

So the C++0x standard has now been completed and has been submitted for final ratification. Whatever else has been changed in the standard, the single most important thing from my perspective is that C++0x standardizes its thread model. To date people have had to use add-on libraries, usually pthreads, suffering various doubts and uncertainties due to there being no memory model. Moreover the lack of a standard thread model has hindered the development of higher-level abstract models more appropriate for applications developers than shared-memory multi-threading.

Java has had a standard thread model since its inception. For the last 16 years people have been using this for applications programming, and more and more coming to realize that shared-memory multi-threading is not a technology for applications programming: it is infrastructure for creating higher-level abstractions. Doug Lea, in leading the JSR166 work, has brought some of these to Java. You may have to wait till Java 8 before you get the best bits, but most people are using JSR166 directly (even though it is an extra set of dependencies) or are using libraries such as GPars. In fact GPars is built on JSR166, providing even higher-level abstractions such as actors, dataflow, communicating sequential processes (CSP), and agents.
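To give a flavour of what JSR166 provides beneath those abstractions, here is a minimal fork/join sketch. It uses the java.util.concurrent names the classes acquired in JDK 7; with the jsr166y artefact the package name differs but the code is otherwise the same.

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// Minimal fork/join example: sum an inclusive range of longs by
// recursively splitting it until chunks are small enough to sum directly.
public class ForkJoinSum extends RecursiveTask<Long> {
    private final long from, to;  // inclusive range to sum

    ForkJoinSum(final long from, final long to) {
        this.from = from;
        this.to = to;
    }

    @Override protected Long compute() {
        if (to - from < 1000) {  // small enough: sum sequentially
            long total = 0;
            for (long i = from; i <= to; ++i) { total += i; }
            return total;
        }
        final long mid = (from + to) / 2;  // otherwise split in two and fork
        final ForkJoinSum left = new ForkJoinSum(from, mid);
        final ForkJoinSum right = new ForkJoinSum(mid + 1, to);
        left.fork();                        // run the left half asynchronously
        return right.compute() + left.join();
    }

    public static void main(final String[] args) {
        final long result = new ForkJoinPool().invoke(new ForkJoinSum(1, 1000000));
        System.out.println(result);  // 500000500000
    }
}
```

The point of the work-stealing pool is that idle worker threads steal forked sub-tasks from busy ones, so the programmer expresses the split/merge structure and the pool handles the scheduling.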

Amongst other JVM-based languages the move away from explicit use of shared-memory multi-threading is a rampant rush: Scala is using the Actor Model, there is the Akka library, and Clojure is using agents - and software transactional memory, though I think that is an aberration: a sticking plaster to try to make the shared-memory multi-threading model continue to be usable for applications programming in the face of increasing evidence that the model is not that usable and will not scale.

What about in the native code arena? Go is using what is effectively a minor variation on CSP: the goroutines' process and message-passing system is based on a single-thread-per-process, synchronous message-passing design that was developed independently of CSP but has very similar semantics. D has what is effectively the Actor Model as its underlying model of concurrency and parallelism. This leaves the two major players in the native code arena: C and C++.

C is incorrigible, but then it is just a portable assembly language, so in a sense that is understandable. C perhaps should stay just as it is exactly because it is a portable assembly language, and serves a useful purpose because of it - C is an important language for writing system-level infrastructure that has to be portable. What is incomprehensible though is why applications programmers continue to use it. C is clearly far too low level a language for applications programmers to use; they should be using Python, Ruby, C++, D, Java, Groovy, Scala, Clojure, . . . anything but C. In terms of concurrency and parallelism, C perhaps should stay as it is - though perhaps see whether a language such as Go, with garbage collection and a standard thread/memory/concurrency/parallelism model, actually takes off as a replacement systems programming language.

Of course C++ is really just a systems programming language as well, but it does have some higher-level capabilities making it suitable for applications programming. Has D, though, already got to where C++ can only hope to go in the future? Has C++ come to the "higher level abstractions" table too late? Can D gain traction in the market before C++ finally gets to the table?

It will be interesting to see whether people using C++ just slide into continuing to use shared-memory multi-threading, now via the standard rather than an add-on library, or whether they see the light and take up libraries providing higher-level abstractions, such as Just::Thread Pro, which provides Actor Model, Dataflow Model, and possibly agents in the future, or C++CSP2, which provides an authenticated CSP implementation for C++. Maybe instead people will begin an increasing move to D, which already has Actor Model and other higher-level infrastructure in place.

C++ is definitely trailing the JVM-based languages just now. Perhaps the new standard is an opportunity for C++ to re-enter the fold of appropriate languages? (Assuming people don't just continue with shared-memory multi-threading as an application programming technique.)

Lots of questions, currently no answers. Only time will tell. As usual.

Cameron's Red Tape Challenge Confidence Trick

After the attempt of "let's sell off publicly owned forest to our monied friends" which got u-turned due to public outcry, we now have "The Red Tape Challenge", which appears to be "let's repeal all wildlife and conservation laws so our monied friends can do what they want with all public land". In many ways this new plan is even more wide ranging and detrimental than the last one. Is Cameron's plan to wear down the UK public so that in the end there is no outcry when he simply gives all public land to his monied friends to do with whatever they wish?

What is really incomprehensible is that Vince Cable is in charge of handling this. One can only assume that the Liberal Democrats really are now just Conservatives with a yellow rosette rather than a blue one. Can anyone now take the Liberal Democrats seriously as a party independent of the Conservatives?

Cameron's strategy appears cleverly divisive. There are elements of excessive "red tape" in the system. Most of it is to do with tax law; none of it is to do with wildlife and conservation laws per se. By setting up the wildlife and conservation laws as the "baddie" holding back the UK's economic recovery, Cameron is trying to create a public antipathy towards said laws, even though all the arguments are sophistic and even casuistic. Cameron really cannot be allowed to get away with this huge confidence trick. There has to be a massive public outcry.

If you don't believe my rant, believe RSPB, see this.

To be honest, forget the whole AV debate, this "Red Tape Challenge" is far more serious, potentially unrecoverably damaging, and needs much more airplay to show what a despicable con attempt it is.

Copyright © 2017 Russel Winder -