Computing History Matters
In a few milliseconds my brain made the connection and I had one of those “so that is where it came from!” moments.
When I was in Silicon Valley last week, I had the privilege of taking a tour of the Computer History Museum. One of the stops was in front of the hanging grids:
I had no idea what they were. My friendly tour guide, Fred Ware, told me that I was looking at “core memory.” It was an early form of memory technology that used small rings, or cores, through which wires were inserted. This setup allowed the machine to control the polarity of the magnetic field created and thereby store data inside of them.
When I first started programming on a Unix machine in college, I would occasionally do something stupid like try to peek at the contents of memory address zero, which would do bad things, cause a crash, and give me a “core dump.” I had always assumed that the “core” referred to in the error message was an adjective for “main,” but when I saw real “core memory,” I realized that I had misunderstood the term for years.
Our industry is full of terms that seem awkward unless you know their history. Even basic things like the “print” method in almost all languages don’t make much sense unless you see that it goes back to the days when you connected to a computer via a TeleTYpe (TTY) machine that printed its result on rolled paper.
This isn’t unique to computing. Human languages have quite a bit of history in their words. For example, the word “salary” comes from the bygone days when you were paid for your labor in salt. The difference with computing history is that most of it happened recently and most of the people that created it are still alive.
Great stories are just below the surface of things you use each day. While I knew that Linux is a Unix-like operating system that was started by Linus Torvalds when he was a student at the University of Helsinki, I didn’t realize until last week the interesting story of “Unix” itself. The name probably came about as a result of a silly pun. It was a “castrated” version of MULTICS. The MULTICS operating system was developed in the mid 1960’s to be a better time-sharing operating system for the expensive computer time of the day. Perhaps frustrated with the complexity of MULTICS, Ken Thompson wrote a rough “simplified” version of it in a month. Thus, the story goes, it was “UNIplexed” and simpler where MULTICS was “MULTiplexed” and complicated.
Unix’s history gives color to the people that created it. I’m certainly no Ken, but I can relate to the feeling that some code seems overly complex and feel the itch to rewrite it as something simpler. It’s neat to see that his diversion worked out so well. It’s also a testament to the people behind MULTICS that most of the central ideas in our “modern” operating systems trace their origin to it.
Programming languages tend to have a story as well. Usually the most interesting is the philosophy that drove their creation. A good representative sample is the philosophical gap between Simula and C++.
Simula is regarded as the first object-oriented programming language. It too was developed in the golden era of Computer Science in the 1960’s. As stated by its creators:
“From the very outset SIMULA was regarded as a system description language”
This naturally led to the bold design objective #6:
“It should be problem-oriented and not computer-oriented, even if this implies an appreciable increase in the amount of work which has to be done by the computer.”
It was so bold, that they had to tone it down a bit. They still had 1960’s hardware after all:
“Another reason for the [de-emphasizing of the above goal] was that we realized that the success of SIMULA would, regardless of our insistence on the importance of problem orientation, to a large extent depend upon its compile and run time efficiency as a programming language.”
Regardless, it’s telling of its philosophy. As David West writes:
“Both Parnas and the SIMULA team point to an important principle. Decomposition into subunits is necessary before we can understand, model, and build software components. If that decomposition is based on a ‘natural’ partitioning of the domain, the resultant models and software components will be significantly simpler to implement and will, almost as a side effect, promote other objectives such as operational efficiency and communication elegance. If instead, decomposition is based on ‘artificial,’ or computer-derived, abstractions such as memory structures, operations, or functions (as a package of operations), the opposite results will accrue.”
As David continues, the philosophy of Simula was:
”… to make it easier to describe natural systems and simulate them in software, even if that meant the computer had to do more work.”
Bjarne Stroustrup took a different approach with C++:
“SIMULA’s class-based type system was a huge plus, but its run-time performance was hopeless:
The poor runtime characteristics were a function of the language and its implementation. The overhead problems were fundamental to SIMULA and could not be remedied. The cost arose from several language features and their interactions: run-time type checking, guaranteed initialization of variables, concurrency support, and garbage collection…” (Emphasis added)
“C with Classes [precursor to C++] was explicitly designed to allow better organization of programs; ‘computation’ was considered a problem solved by C. I was very concerned that improved program structure was not achieved at the expense of run-time overhead compared to C.” (Emphasis added)