by David Davies April 2020 (F867)

Table of Contents

  1. Early days (before 1969)
  2. Ken Mitchell (Mitch) 1969
  3. Design in Brittle Materials
  4. Electronic calculator (~1969)
  5. Dr Duckworth (1969)
  6. Research Planning Diagrams (1969 onwards)
  7. Time sheet analysis (as at 1969)
  8. Computing by time sharing (1970 to 1976)
  9. A mini-computer at Fulmer (1976)
  10. Pocket calculators (from 1970s on)
  11. Microcomputers (1980s)
  12. People
  13. Mathematical modelling
  14. Analysis with Uncertain Quantities (AUQ)
  15. Epilogue

Introduction

This article is an account of the development of calculating and computing at Stoke Poges. It is almost wholly based on my personal recollection with very little documentary evidence. Please forgive omissions in areas of work I was not aware of and errors of timing in those I was. As will become apparent, 1969 was a turning point in this development.

Early days (before 1969)

In the early days, most of the work at Fulmer made fairly modest computational demands. A notable exception was X-ray crystallography. The interpretation of both single-crystal and powder photographs must have required a large amount of numerical integration although I don’t know what help was available in the form of tabulated functions. Rex Waghorne was a professional mathematician working in the physics department and I have a vague recollection of him telling me that he had hired time on a computer at the Wexham Springs laboratory of the Cement and Concrete Association, very near to Fulmer.

Technical staff had slide rules and, where accuracy was needed, everyone was adept in the use of tables of logarithms and trig functions. The library could provide tables of less usual functions, of statistical functions and of integrals.

Brunsviga calculator

In calorimetry it was necessary to evaluate the area under a temperature/time curve by Simpson’s rule. For this and many other calculations people in the physical chemistry department used a Brunsviga manually operated calculating machine.
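The integration itself is simple enough to sketch in a few lines of modern code. The Python fragment below is purely an illustration of Simpson’s rule on equally spaced samples – at the time, of course, the ordinates were read off the chart and the arithmetic done on the Brunsviga.

```python
def simpson(ys, h):
    """Integrate equally spaced ordinates ys (odd count) with spacing h
    by Simpson's rule."""
    if len(ys) % 2 == 0:
        raise ValueError("Simpson's rule needs an odd number of ordinates")
    s = ys[0] + ys[-1]
    s += 4 * sum(ys[1:-1:2])   # odd-indexed ordinates, weight 4
    s += 2 * sum(ys[2:-1:2])   # even-indexed interior ordinates, weight 2
    return s * h / 3

# Check against y = x^2 on [0, 2], whose exact integral is 8/3
ys = [(0.5 * i) ** 2 for i in range(5)]
area = simpson(ys, 0.5)
print(area)  # exact value is 8/3
```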

On the administration side, accounts were prepared manually using adding machines. At Fulmer, as at many organizations offering a professional service, clients were charged for the time spent on their projects. Every member of the technical staff, including technical managers, had to complete a weekly time-sheet allocating the 37.5 hours to the projects worked on. Filling in your time sheet was not a popular job. There was constant pressure from management to charge as much time as possible to rechargeable projects. Where one department assisted with work for a project in a different department, hours chargeable were subject to negotiation. Many people were dilatory in submitting their time sheets. Analysis of the time sheets was done manually in the cashier’s office and was slow, even when the time-sheets were in.

Reports were typed on stencils for duplication on a Roneo machine. Correspondence was typed on sets of paper interleaved with carbon paper to give coloured filing copies. Typists and secretaries used manual typewriters until the mid 1960s when almost all moved on to IBM Selectric “golf-ball” typewriters[1].

Ken Mitchell (Mitch) 1969

Ken Mitchell headed Fulmer’s engineering and mechanical testing department and had done so since 1954 or before. Sadly, suddenly and completely unexpectedly, he died of a heart attack. I believe this was in early 1969.

Design in Brittle Materials

At the time, unknown to me, Mitch was carrying out an important project for the Admiralty Materials Laboratory at Holton Heath near Poole. The project aimed to create guidelines for the design of engineering components made from fully brittle materials. This contract was immediately assigned to me. When I took over Mitch’s notes I found they consisted of tabulated test results of hundreds of bend tests on rectangular bars of reaction-sintered silicon nitride together with a few equations that I didn’t understand.

Most material objects contain flaws which, under loading, give rise to stress concentrations. The trouble with brittle materials is that unlike ductile materials they have no means of dissipating these local high stresses and therefore tend to fail at the most severe flaw. Characterization of a brittle material therefore requires establishing the whole distribution of its strength. It’s not enough to know mean and standard deviation; we need the extremes.

MADAS 20AZS calculator (attr: MaltaGC [CC BY-SA])

Studying Mitch’s notes it became clear that he had made some progress with fitting a Weibull distribution to his test results using laborious numerical methods.

I also discovered that he had a Madas electromechanical calculator. And I found myself taking this machine home in the evenings to work on Mitch’s data. This was no joke; it weighed 16 kg!

Graph paper for Weibull analysis (Z326 © Fulmer Research Institute)

Graphical methods helped. I produced some special graph paper for Weibull analysis of strength test data. If you had a set of strength results the method was to sort them into ascending order and plot them on this paper. If they fell on a vertical line, this showed that they followed a Weibull distribution. The position of the line gave an estimate of the Weibull modulus of the material. This modulus is a dimensionless number; the higher the Weibull modulus, the more uniform is the strength of the material.
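The graph paper amounted to a linearisation of the Weibull distribution. A modern sketch of the same idea in Python follows; the median-rank plotting positions and least-squares fit used here are common illustrative choices, not necessarily those built into the Fulmer paper.

```python
import math
import random

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m by fitting ln(-ln(1-F)) against
    ln(strength): for a Weibull distribution these points lie on a
    straight line of slope m (the graphical method, done numerically).
    Uses median-rank plotting positions F = (i - 0.5)/n."""
    s = sorted(strengths)
    n = len(s)
    pts = [(math.log(x), math.log(-math.log(1 - (i - 0.5) / n)))
           for i, x in enumerate(s, start=1)]
    xbar = sum(x for x, _ in pts) / n
    ybar = sum(y for _, y in pts) / n
    num = sum((x - xbar) * (y - ybar) for x, y in pts)
    den = sum((x - xbar) ** 2 for x, _ in pts)
    return num / den   # least-squares slope = estimated modulus

# Synthetic bend-test results drawn from a Weibull distribution, m = 10
random.seed(1)
sample = [300 * random.weibullvariate(1, 10) for _ in range(200)]
print(round(weibull_modulus(sample), 1))  # should come out close to 10
```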

ANITA Mk VIII calculator (attr: MaltaGC [CC BY-SA])

Electronic calculator (~1969)

In the process metallurgy department we bought an ANITA Mk VIII calculator, the world’s first all-electronic desktop calculator. This became widely used for applications such as calculating charge sheets for alloy melts.

Dr Duckworth (1969)

In June 1969, on Mr Liddiard’s retirement, Dr Duckworth took over as Director of Research.

Research Planning Diagrams (1969 onwards)

Dr Duckworth came with many ideas on how to improve Fulmer’s organization and management; one of these was the project management technique of Critical Path Method (CPM). We were encouraged to use CPM in planning our respective projects based on modelling them as activity network diagrams. In essence CPM is a tool for scheduling the individual tasks required to complete a project in such a way that delay in overall completion is minimized. CPM or something like it is now used for almost all major construction and many other projects.
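The heart of CPM is a longest-path calculation through the activity network. As an illustration only (the activities and durations below are invented), it can be sketched in a few lines of Python:

```python
# A toy activity network: each task has a duration and a list of
# predecessors that must finish before it can start.
durations = {"A": 3, "B": 2, "C": 4, "D": 1}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

def earliest_finish(task):
    """Earliest finish time = latest predecessor finish + own duration."""
    start = max((earliest_finish(p) for p in preds[task]), default=0)
    return start + durations[task]

finish = {t: earliest_finish(t) for t in durations}
project_length = max(finish.values())
print(project_length)  # 8, via the critical path A -> C -> D
```

Any delay to a task on the critical path (here A, C or D) delays the whole project; tasks off it (B) have slack.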

The problem with using CPM for R & D projects is that these are full of questions such as “what will happen?” and “what shall we do then?”. I suggested that such projects would be better modelled by computer programming flow-charts than by network diagrams and I went on to develop the Research Planning Diagram (RPD) method for planning projects under uncertainty.

Time sheet analysis (as at 1969)

Dr Duckworth was not content with the situation he found – rudimentary monthly management reports, out of date on arrival.

Computing by time sharing (1970 to 1976)

We clearly needed computing facilities. At this time (1970) computers were still prohibitively expensive and far beyond Fulmer’s budget. However, time-sharing was beginning to become available as a service. Time-sharing was a way of sharing the very expensive resources of a computer between many users: while one task is waiting, for input for example, it can be suspended and another task takes over use of the processor. This process had been invented in the 1950s but was not practicable until the size of memory and the speed of data transfer had increased sufficiently.

We equipped ourselves with the necessary hardware and I learned the language BASIC.

Analogue modem – acoustic coupler (attr: secretlondon123 CC BY-SA 2.0)

Access to the computer was via a normal speech telephone line using an acoustically coupled modem at 10 characters per second. Compare this with today’s broadband speeds, which are hundreds of thousands of times faster.

Teletype Model 33 ASR (attr: ArnoldReinhold, SR-A33 at CHM.agr, CC BY-SA 3.0)

The terminal was a Teletype Model 33 ASR teleprinter. There was no screen; both input and output were recorded on a roll of paper. At the left of the machine was an eight-bit paper-tape reader/punch. This enabled one to prepare programs and data off-line in advance of a session and to output results in a permanent format.

The machine was hated by trained typists because of the noise and the radically different feel of the keyboard.

A computing session with this setup could be a nerve-wracking affair. You incurred charges for the amount of data transmitted and for the amount of CPU time used as well as the normal telephone charge. In the early days we used a time-sharing service running on a Honeywell 1648 computer in the USA so we were paying for a transatlantic call. If you set a large task running, the terminal just sat there in silence. You had no idea whether anything was happening, whether the telephone line had dropped or whether your program had entered an infinite loop. You were mightily relieved when the machine finally clattered into life.

When we acquired computing facilities, I wrote BASIC programs to do Weibull analysis of strength data. I also wrote programs to analyse the time-sheets. We started to produce management reports showing a breakdown by project, by department and by person. This suite of programs was later ported by Neil Kennedy to dBase and was greatly extended to include detailed forecasts. Eventually about 30 management reports were produced fairly promptly for each four-week period.

For RPD I wrote a simulation language that could represent an RPD project plan and generate a BASIC program from it. Running this program would then give probabilities for the various possible outcomes and probability distributions for project duration and cost. This gave a great deal of insight but was never wholly successful. It was all too easy to formulate an RPD plan that would result in a simulation model that would not run or would run for ever!
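A toy version of such a simulation can be sketched in modern Python (the original system generated BASIC, and the probabilities, durations and branch structure below are entirely invented for illustration):

```python
import random

def simulate_project(rng):
    """One Monte Carlo run of a toy RPD-style plan: an experiment that
    may need repeating, then a decision branch for extra development."""
    duration = 4                   # weeks of initial design work
    while rng.random() < 0.3:      # experiment fails; repeat it
        duration += 2
    duration += 2                  # the successful experiment
    if rng.random() < 0.5:         # branch: further development needed
        duration += 6
    return duration

rng = random.Random(42)
runs = [simulate_project(rng) for _ in range(10_000)]
print(sum(runs) / len(runs))       # mean project duration, weeks
print(max(runs))                   # worst case seen in this simulation
```

Repeated runs give a probability distribution for duration rather than a single figure; the infinite-loop hazard mentioned above corresponds to a repeat probability at or near 1.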

For Dr Gross we produced tabulations of functions which he had derived from his thermodynamic calculations. This saved a lot of manual computation. For Grev Brook we did curve fitting.

A mini-computer at Fulmer (1976)

By 1976 it was clear that we had an established need for computing facilities and also that we were spending an unsustainable amount of money on time-sharing. At the same time, there was a growing number of smaller computers (mini-computers) being sold at more affordable prices. A very persuasive IBM salesman tried to sell me an IBM 5100 Desktop machine. This would do most of what we had been doing by time-sharing but was expensive for what it was and was obviously unsuitable as a basis for future development. I was very glad that we resisted his blandishments though my wife Gill and I did enjoy his kind hospitality at the Royal Opera House.

Interdata 7/32 mini-computer (attr: https://en.wikipedia.org/wiki/Interdata_7/32_and_8/32#/media/File:Living_Computer_Museum_IMG_0002_(9636198071).jpg. CC BY-SA-3.0)

We took advice from a local computer consultant and bought a Perkin-Elmer Interdata 7/32 mini-computer, largely because it was the only mini-computer at the time with 32-bit general registers and was therefore capable of directly addressing the whole memory of the machine.

I can’t remember the hardware specification of this machine but I believe it had 64kB of core memory, a half inch tape drive and four 1 MB hard disk drives, two of which were IBM 2315 removable cartridges.

It had a console with a keyboard and screen, and reader-punches for paper tape and for punched cards. It was easy to hook up the Teletype to talk to the Interdata. We also bought a Potter line-printer which printed 600 lines per minute, I think. This was housed in an acoustic cabinet which deadened some of its fearsome row.

The operating system for the 7/32 was Interdata’s OS/32. This allowed multitasking, not by time-sharing but by segmenting the memory. A compiler was available for Fortran IV and an interpreter for BASIC.

Part of the package was training. Neil Kennedy and I went on a two day course – a day on operations and a day on assembly language. This was the only computer training I’ve ever had and I found it too rushed to be of much use to me.

We installed the computer on the top floor of the Yarsley building in the south-west corner offices that were later converted into the conference room. We provided the room with air-conditioning.

Before long we had managed to port all our time-sharing work to the Interdata.

After a few years space at Stoke Poges became available as Yarsley work moved to Redhill. Among a series of laboratory moves, the computer lab was relocated to the top floor of the main house at the south west corner. The room had previously been a physical chemistry lab. Additional hardware included an IBM 2741 terminal (like a golf-ball typewriter) which could produce high quality printed output.

At this time we were lucky that the Unix operating system had just been ported to the Interdata 7/32 by an Australian university[2] and Perkin-Elmer had made it available. Unix was installed on our machine by an amazingly skilled operator[3]. He questioned us about which options we wanted while typing at the console what were then to us totally incomprehensible commands, at speeds that would shame many skilled copy typists. From then on almost all our work was done under Unix. Today Unix and Unix-like operating systems such as Linux, Android and macOS are widely used on hardware from super-computers to smart watches and most of the servers on the internet.

Our owners, the Institute of Physics (IoP) gave us a contract to maintain their membership records. Using some rather Heath-Robinson low level programming, Neil Kennedy managed to squeeze the Institute of Physics membership data onto a disk cartridge and we maintained the IoP membership records for two or three years until the data outgrew our facilities and the IoP decided to move the contract to a specialist bureau.

Hewlett Packard HP65 calculator (attr: © David Davies)

Pocket calculators (from 1970s on)

During the 1970s, the development of integrated circuits led to miniaturisation of electronic devices including calculators. Two popular choices of scientific calculator at Fulmer were the Texas Instruments TI30 and the Hewlett Packard HP35. The latter was said to be the result of Bill Hewlett’s challenge to his co-workers to produce a calculator that could fit into his shirt pocket.

In 1969, on my second visit to Pakistan as part of the UNIDO contract to set up a Metals Advisory Service, I chose to take with me an HP 65 calculator – a development of the HP35 that could be programmed, with the programs stored on magnetic strips. This powerful and compact machine caused some amazement among my Pakistani colleagues.

Microcomputers (1980s)

In the late 1970s what later became known as home computers or personal computers started to appear, aimed at the hobbyist market. Early successful examples were the Commodore PET, the Apple II and the TRS-80. In 1981 the BBC launched a major education programme based on the BBC Micro[4]. By 1985, 80% of UK schools had a BBC Micro. This programme succeeded in that it led to a generation in which everyone knew something about computers and every technically interested teenager knew something about how to use them.

At Fulmer, with a developing interest in instrumentation and sensors, people in the Physics Department and in Bill Bowyer’s group used microcomputers to control processes and to record and process results. In particular, Apple II machines started to appear in many places, usually with their covers removed. I also remember seeing a Commodore PET. I wasn’t involved in any of this so I know nothing more about it. In the mathematics lab we used a BBC Micro as a convenient way of showing colour graphical output.

In the early 1980s the first practicable office applications appeared on microcomputers: the spreadsheet VisiCalc and the word-processor WordStar. I remember Peter Kent demonstrating WordStar to me on an Apple II in the solar lab. I was sceptical at the time. This was before WYSIWYG (What You See Is What You Get) was affordable; the typist had to enter control codes to change font or to embolden or italicise. I could foresee strong resistance in our office.

Within a year or two word processing had moved on. To equip the office I chose Apricot microcomputers running Microsoft Word for word-processing and Multiplan for spreadsheets, under the MS-DOS operating system. These machines were not hardware-compatible with the IBM PC but had superior screen resolution and two 3.5 inch floppy disk drives instead of the PC’s old 5.25 inch drives. We were then able to issue research reports of far superior presentation quality to that of the duplicated stencil.

In the mathematics lab we went on to acquire an Apricot Xi machine which had a hard drive. We were able to load SCO Xenix onto this machine (a version of Unix) and also an RM Fortran compiler. Our hardware was completed with a Frontier terminal which for the first time gave us facilities for high resolution colour graphics.

People

In about 1973 Neil Kennedy joined me, having just graduated from Leeds University with a degree in computer science. He was recruited to support our time-sharing and then took charge of our mini-computer operations.

In the late 1970s I was increasingly being asked for services which required serious mathematical modelling beyond the maths expertise of Neil and myself. I was also developing my ideas on uncertainty analysis and I wanted to be sure that these were mathematically sound. We were extremely fortunate in being able to recruit two brilliant mathematicians.

First, in 1979, we persuaded John Denison to join us. He had a double first from Cambridge in mathematics and engineering and had been one of the pioneers of computing in this country. He had learned to program Alan Turing’s ACE Pilot Model computer at NPL and had then been part of the team at English Electric that developed DEUCE, the re-engineered version of ACE.

In the early 1980’s Simon Jones joined us from the Institute of Hydrology. He was another Cambridge first class mathematics graduate and, like John, was completely sure-footed in those many areas of applied maths where I was shaky. Simon was early in his career and saw Fulmer as an opportunity to widen his experience.

Both John and Simon were not only very skilled and effective but were very good communicators of complex concepts. I learned a lot from each of them.

Each year, through most of the 1980s, we employed a third-year sixth-form student after the Oxbridge scholarship exams until they started their university course in the autumn. These young people were an inspiration and almost all achieved a great deal in the six to eight months they were with us.

Mathematical modelling

A good deal of our mathematical modelling work consisted of formulating problems in terms of differential equations and solving them using finite-element, finite-difference or boundary-element packages. Among the packages used were BERSAFE[5], ANSYS and DYNA2D. For general maths work we used the Interchange Core Mathematics Library.

We worked on a wide range of problems. Some examples are:

        • design of aluminium window frames to minimise heat transmission
        • emulsion polymerization
        • solidification conditions in continuously-cast metals
        • prediction of rock-bursts in deep hard-rock mining
        • the design of an aircraft in composites
        • acoustic tracking to give immediate feedback in rifle ranges
        • simulation of blunt trauma to the wearers of body armour
        • radar absorbing materials.

Analysis with Uncertain Quantities (AUQ)

The quantitative analysis of RPD plans requires the subjective assessment of probabilities of events. This makes many statisticians squeamish – they are happier with probabilities supported with arguments from symmetry or from experimental frequency. I have never had such qualms. I remember a small project in the mid 1960s in which we had to determine why smokeless fuel on an open fire tends to spit hot fragments onto the carpet.

I devised a method which involved subjective assessment of the probability of the truth of rival hypotheses. Now, in the 1980s, I wanted to address the problem that all measured quantities, and indeed most of the quantities that appear in our mathematical equations, are subject to uncertainty. I devised a method called Analysis with Uncertain Quantities (AUQ). For any calculation of interest, the method requires the allocation of a probability distribution to each quantity. Often, there is very little evidence on which to base the probability distribution; in such cases one must use subjective judgment. The calculation is then done many times with random samples from these distributions – the Monte Carlo method.
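In miniature, the AUQ idea looks like this. The Python sketch below is an invented example, not a Fulmer calculation: power dissipated in a resistor, P = V²/R, where both V and R are uncertain quantities with assessed distributions.

```python
import random
import statistics

rng = random.Random(0)
N = 100_000
samples = []
for _ in range(N):
    V = rng.gauss(12.0, 0.5)           # volts: measured, roughly normal
    R = rng.triangular(90, 110, 100)   # ohms: a subjective triangular guess
    samples.append(V * V / R)          # propagate through the calculation

samples.sort()
print(round(statistics.mean(samples), 2))   # central estimate, watts
print(round(samples[int(0.05 * N)], 2),     # 5th percentile
      round(samples[int(0.95 * N)], 2))     # 95th percentile
```

The output is not a single number but a distribution, from which one can quote a central estimate together with an honest interval of uncertainty.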

Examples of RPD and AUQ projects were:

    • Project appraisal: the design of the cargo-length of a ship by computer
    • Project appraisal: the cutting out of shoe parts by fluid-jet cutting rather than steel knives
    • Technological forecasting: the extent of the use of hydrogen as an energy vector up to the year 2025.

In 1986 Simon and I organized and hosted the first international conference on Modelling under Uncertainty. The proceedings were published in book form by the IoP[6].

Methods similar to AUQ are now fairly widely used, but in the 1980s we encountered reluctance to use the technique. A genuine problem is that, unless the subjective assessment of probabilities is done by the ultimate decision maker, AUQ is subject to political manipulation. But also, as we found out when we tried to sell its use in the City, a financial guru would rather give a point estimate that’s wrong than a probability distribution that admits a high degree of uncertainty. Discouragingly, uncertainty is usually much greater than we are happy to acknowledge.

Epilogue

I enjoyed my career at Fulmer; I made many friends, learned many things and met many interesting people. But apart from this I’ve been privileged to have witnessed at first hand the development so far of information technology – the most transformative technology since the steam locomotive.

It’s as if a friend described to me his trip on the first Stockton to Darlington rail journey and I’ve lived to see the Mallard’s record-breaking run.

 

NB Except where otherwise stated the photographs are copied from Wikimedia Commons and licensed under CC BY-SA (http://creativecommons.org/licenses/by-sa/3.0/)

 

[1]   One senior secretary rejected the new-fangled electric typewriter and stayed with her trusted manual machine until her retirement in the mid 1980s!

[2]   University of Wollongong, New South Wales.

[3]   I only met him a couple of times but I remember him to this day. He was an Iranian called Bipin Datani.

[4]   This was a development of the Acorn Atom. In 1981 I had built an Atom at home from a kit, to learn more about micro-computers.

[5]   John had been one of the original authors of BERSAFE at English Electric.

[6]   Jones, S B and Davies, D G S (Eds) (1986) Modelling under Uncertainty 1986, Institute of Physics Conference Series Number 80, Bristol and Boston: Institute of Physics, ISBN 0 85498 171 3