In the beginning there was machine code, a series of ones and zeros entered directly into your computer in order to make it do useful stuff. But you need to take a step back in time to see how even that worked.
The fundamental operations of a general purpose computing machine are to accept a value, perform an operation on that value, and return the result. Now the fact of the matter is that it’s complicated and tricky to get anything done at that level, since what you can mostly accomplish is turning 0s into 1s simply because they are 0s, and vice versa. On the other hand, since such machines can evaluate input and produce a measurable result, they are theoretically capable of solving any problem that can be solved by computation (see Alan Turing).
This is incredibly tedious.
The great breakthrough came when you could work with more than one piece of information at a time, retrieve and store the results, and accept multiple instructions. Both the instructions and the results are stored in what we commonly call memory and are loaded into the machine, operated on, and returned to memory.
Now there are all kinds of funny math tricks you can perform that look like addition and subtraction in a binary world, as well as transformations based on values you examine, but a LOT of the instructions that even a simple computer executes have to do with managing its workflow: retrieving things from memory, including the next action to perform.
The computer doesn’t care what its next activity is; in fact, results, information yet to be processed, and instructions all look exactly the same, so the prospect of writing instructions based on previous processing (self-modifying code) is absurdly easy.
And you may think this is a good thing rather than a bad one, because of course you want your computer to be responsive to the fact that the result of the previous operation was 4 rather than 5. In practice, though, a whole lot of programmers would forget just exactly what they were trying to accomplish, and the pointer that told the computer where to get its next instruction would end up directing it somewhere the value was not just wrong, but random.
So hardware came to make a distinction between programs and operational data, but because it is so damn useful sometimes to make an exception it’s more a guideline than a rule.
In the 50s, 60s, and 70s, programming itself kind of branched between those who were mostly interested in getting the damn things to work at all and those who were more interested in practical results. This led to the rise of symbolic languages of the type most programmers today would recognize: COBOL, FORTRAN, and BASIC.
All these seminal languages had the great virtue of reading more like real English than POP and MOV, didn’t care how many registers you had or their exact bit width, and kept the instruction pointer from being easily spoofed (though try a recursive subroutine sometime, I dare you).
COBOL is in most implementations a strongly typed language. I remember an interminable amount of time being devoted to analyzing exactly what data would be needed to solve the problem and what types of values would be acceptable.
Because that is the BIG difference between a strongly typed and a weakly typed language. Unless you specifically state that you are going to transform your data from one type to another, from numbers you can add and subtract to characters you can alphabetize for instance, a STRONGLY typed language will reject your program and refuse to work at all if you attempt to perform an operation disallowed for that type of data.
FORTRAN and BASIC are weakly typed, and BASIC doesn’t even require that you pre-declare variables; it just assigns them on the fly based on what it thinks is appropriate. Both will operate on your data whatever way you tell them you want, and unless ‘A’ and ‘B’ have previously been assigned values as symbols for memory locations (variables), adding them gives you 131 and subtracting ‘A’ from ‘B’ gives 1, because the machine sees only their character codes (65 and 66).
This is incredibly handy for certain types of operations.
In the early 80s there was a great debate between the supporters of strongly typed and weakly typed programming. The standard bearers of strong typing were PASCAL and Modula-2; the champions of weak typing, BASIC and C (a language dating from the early 70s, but at the time not nearly as popular).
Looking back from a perspective of 30 years (and oh yes I have my newsstand copy of the August 1984 Byte), I think we can declare weak typing the clear victor.
As a practical matter (and I have written and maintained hundreds of thousands of lines of code), with any of the “unstructured”, “weakly typed” languages you can be as structured as you want to be, and I’m a great believer in structure. But you don’t have to be, and that is the beauty part of the craft languages as opposed to the academic ones. The academic languages militate in favor of comprehensive analysis in advance of practical application, and the craft languages…
Well, they solve problems.
Today, when I write poetry for machines, I do it in ‘C’, which has useful concepts like structures, which are containers for disparate types of data, and pointers, which can point to anything from a value to a sub-routine.
And this big A? It stands for Anarchist.
Science Oriented Video
The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations – then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation – well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.
–Sir Arthur Stanley Eddington, The Nature of the Physical World (1927)
Science News and Blogs
- NASA probe shows Pluto might have a polar ice cap, by Michael Franco, CNet
- North-south climate divide: Researchers find 200-year lag between climate events in Greenland, Antarctica, Phys.org
- Small Jurassic Dinosaur May Have Flown Without Feathers, by John Noble Wilford, The New York Times
- Hail and Farewell: A Robot From Earth Dies on Mercury Today, by Danielle Venton, Wired
- Robotic telescope discovers three super-Earth planetary neighbors, by Robert Sanders, UC Berkeley
- Water could have been abundant in the first billion years, Phys.org
- Hot Vents on the Seafloor May Hold the Origins of Life, by Catherine Griffin, Science World Report
- Black Band Coral Disease Killing Kauai Reefs, By Rajender Bhatia, Clapway
- Overfishing Increases Threat of Fast-Growing Sponges to Endangered Corals, AZO Cleantech
- Russian Space Station Cargo Ship Is Said to Be Out of Control, by Andrew Roth, The New York Times
- Embedding classic MS-DOS games into tweets is now a thing, by Samuel Gibbs, The Guardian