The theory of computers 
Rear Admiral

Joined: 31 Dec 2008 20:59
Posts: 3345
I started watching the new series Manhattan yesterday...

And that got me thinking in the most basic of terms about computers. My starting point was the understanding that we humans overthink some things from time to time.

For those who haven't seen it, it is available on Hulu...

But it is their references to computers that got me on this track. I knew this, but hadn't pursued this line of thought very far, till now.

Computers back before ENIAC (US) and Colossus (British Empire) were humans! Typically women, highly trained in mathematics and in the use of mechanical calculators, did the computations... Low volumes of data were typical, because the computation rate depended on humans. That is, the greater the number of humans, the more that could be done.

Fast forward. The early electronic computers - the IBM 605, Univac I, and so on - were designed to handle small amounts of data... By 'small' I mean only a few hundred lines of data, fed in on 80-column Hollerith punched cards. That is 80 bytes at a time, so the data rate was way down there. By the mid 1960s most input systems (punched card, magnetic tape, magnetic drum and disk) permitted higher data volumes, but still far below what we do now. And that is the point.
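To make the "one 80-byte card at a time" point concrete, here is a minimal sketch of how a batch program saw its input: fixed-width records, one card per read. The file contents and field layout here are hypothetical illustration; real card decks varied by installation.

```python
import io

# Sketch: batch processing of 80-column card images, one record at a time.
# The deck contents and field layout below are hypothetical.
CARD_WIDTH = 80  # one Hollerith card = 80 columns = 80 bytes


def read_cards(deck):
    """Yield fixed-width 80-byte records from a card-image file."""
    while True:
        card = deck.read(CARD_WIDTH)
        if len(card) < CARD_WIDTH:
            return  # end of deck (or a short trailer, discarded)
        yield card


# Usage: each iteration sees exactly one card's worth of data.
deck = io.BytesIO(b"".join(f"{i:>80}".encode() for i in range(3)))
for card in read_cards(deck):
    print(int(card))  # fields were parsed purely by column position
```

The point of the fixed width is that the program never holds more than one record's 80 bytes of input at a time; the "database" was the physical deck feeding through the reader.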

The computers of TOS were designed for low data volumes, but complex analysis of that data.

When one watches TOS, one never sees, oh, let's say 101,000 files listed. At best it is only a few files, most likely one. This confused me, till now.

If one is dealing with low data volumes, this explains, to a certain extent, the behavior of the TOS computers. They aren't, to put it another way, very multimedia in nature.


10 Aug 2014 20:28
Rear Admiral
What I am getting at is the idea that there has been a major cultural shift with regard to how computers are to be used. The largest volumes of data were in the accounting areas, on business computers. But even those looked at one punched card at a time, or one file at a time (batch processing).

TOS is strictly in this line of thought.

It may be too simple for some to realize what this means. What it means is that the amount of data the TOS computers were intended to process, we would find today to be totally inadequate for any realistic job onboard a starship. What it means is that the computers were used to point out anomalies to the Science Officer, and the Science Officer would decide what to do about it. Semi-automatic in nature, not automatic.
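That semi-automatic pattern - the computer only flags readings outside expected bounds, a human decides what to do - can be sketched in a few lines. All sensor names, bounds, and values here are hypothetical illustration, not anything from the show.

```python
# Sketch of the semi-automatic pattern: the machine flags anomalies,
# the Science Officer decides. Sensor names and bounds are hypothetical.
EXPECTED = {
    "hull_temp": (0.0, 1500.0),  # acceptable range per sensor
    "radiation": (0.0, 0.5),
}


def flag_anomalies(readings):
    """Return only the readings a human should look at."""
    flagged = []
    for name, value in readings.items():
        low, high = EXPECTED[name]
        if not (low <= value <= high):
            flagged.append((name, value))
    return flagged


print(flag_anomalies({"hull_temp": 900.0, "radiation": 2.1}))
# hull_temp is in range, so only radiation is flagged: [('radiation', 2.1)]
```

Note that the program never ranks, correlates, or acts on the anomalies; it just narrows a small input stream down to the handful of items worth a human's attention, which matches what we see on screen.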

To put it another way, the creators were very handicapped in their thought processes, because they didn't have much to go on.

But what do I mean by low data volumes?

The IBM 360 was one of the fastest computers of its time. Yet when CAT scan imaging was being prototyped, the volume of data to be collected and processed was so high that it took a 360 more than an hour to work through even one section of a CAT scan. Not a whole frame - which was the usable result - just a partial result...
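To get a feel for why one slice was so much work, here is a back-of-envelope count of the arithmetic a simple back-projection reconstruction implies. Every number below is a hypothetical round figure chosen only to show the multiplicative scaling; these are not the actual EMI or IBM figures.

```python
# Back-of-envelope: data volume and arithmetic for ONE CT slice.
# All figures are hypothetical illustration, not historical data.
projections = 180          # one sensor sweep per degree
samples = 160              # detector samples per sweep
pixels = 160 * 160         # reconstructed image grid

raw_samples = projections * samples    # measurements collected per slice
backproj_ops = projections * pixels    # each projection touches every pixel

print(raw_samples)    # 28,800 raw measurements
print(backproj_ops)   # 4,608,000 pixel updates, before filtering,
                      # interpolation, and tape/card I/O on top of that
```

The multiplication is the point: a modest-sounding scan multiplies out to millions of operations per slice, which on 1960s-era instruction rates, plus all the I/O around it, adds up to the hours the poster describes.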


11 Aug 2014 18:56