We were not incredibly poor, although we probably would have been considered as such by US standards (which I'm sure were quite different at the time from Czechoslovak standards). We did have to smuggle our C64 over the Iron Curtain though, so I hope that this counts as something.
Stories like this fascinate me. (I've lived in the US my entire life so it's just completely foreign to my worldview.) What was logistically involved in getting ahold of one? And what were the "legal" alternatives?
> What was logistically involved in getting ahold of one?
Having a grandfather who left the country in 1968 for West Germany gifting me one when we visited him.
> And what were the "legal" alternatives?
Buying one at an outrageous markup in an exclusive shop. I don't remember the exact number (although I could find it out) but the price tag was something like five months of the average Czechoslovak wage at the time. Apparently in the US the equivalent would have been paying $10000 for one (in 1988, mind you). Of course in Germany it cost something like 299 DM or so...
In Poland, for a very long time, private ownership of typewriters, fax machines, radio transmitters and computers was illegal without special permission (a >3-year prison term). You have to remember this was the time the CIA was smuggling those into Poland with the help of the Church. https://www.nytimes.com/1992/02/18/world/reagan-and-pope-rep...
>The report in Time adds many new details, particularly the role of the Central Intelligence Agency and the Roman Catholic Church in opening networks across which telephones, fax machines, printing presses, photocopiers, computers and intelligence information moved to Solidarity.
Personal possession anecdote from book "High-tech za żelazną kurtyną. Elektronika, komputery i systemy sterowania w PRL" (978-83-8098-094-5)
>In 1984, "Informatyka" magazine, involved in the dissemination of these machines, reported on the adventures of Mr. Przemysław, who received in April [...] a package from his brother in Toronto, containing the VIC-20 microcomputer, power supply, cassette recorder, a set of cassettes for television games and English language learning and connecting cables. The Customs Office in Gdynia refused to issue an import license, stating that it could issue [...] only if the computer was necessary for the citizen's professional or scientific work
It slowly got better in the second half of the 80s. CoCom relaxed import sanctions on low-end 8-bit gaming machines in 1984:
"New Media Behind the Iron Curtain: Cultural History of Video, Microcomputers and Satellite Television in Communist Poland" https://research.utu.fi/converis/getfile?id=51338894&portal=...
>The breakthrough in the domestication of computers in Poland took place in the mid-1980s, most likely between 1984 and 1986. In the global context, this might have been relatively late, but in the context of the Eastern bloc it seems that Poland was within the norm. There are two main reasons behind this chronology: one international, one local. Firstly, on an international level, the embargo on 8-bit technology was relaxed in 1984. Computers had been at the heart of the CoCom debate since the mid-1970s, but – as Mastanduno reports – it was not until July 1984 that the embargo on the most popular 8-bit microcomputers was removed, even though at the same time new restrictions were introduced regarding various telecommunications software and solutions.
In 1985 you could finally legally buy an 8-bit Atari in Pewex - a chain of shops exclusively accepting Western currency. Personal ownership of Western currency was illegal :-) but the regime was running low on foreign cash to repay international loans, so they came up with this brilliant plan of opening shops where you could spend your smuggled black-market money semi-officially.
>Secondly, on a local level, as Kluska reports, in the autumn of 1984, the “[Polish] customs office ceased to make it difficult for citizens to import microcomputer equipment.”
In 1986 a weekend computer market opened up in Warsaw in a rented school building. It ran weekly, uninterrupted, up to ~2012, with one location change. Interview with the founder https://spidersweb.pl/plus/2021/04/gielda-komputerowa-prl-la... VHS recording from 1994 https://archive.org/details/gielda-komputerowa-na-grzybowski... Official 'Polish Film Chronicle' newsreel from 1992 https://www.youtube.com/watch?v=mxQqsqqH8ao
Yep, we traded in our Atari VCS (2600) with 23 cartridges for an Atari 400 and two game cartridges (Missile Command and Pac-Man). He spent money that was in short supply to buy a 410 (tape recorder) and the BASIC cartridge. We learned to program and that made all of the difference years on.
There was a small slice of time where consumer, programmable computers were affordable to a large audience in the 80's and very early 90's. Adding to that era was the magazines that provided amazing content such as programs and news. Antic, Byte, Creative Computing, and Dr. Dobbs were the building blocks.
What I really loved about those magazines, living in a small town, was how they simultaneously showed you the variety of what was out there, mostly through the small ads, and the speculative future of the technology through the articles, while also giving a kid the ability to grow their skills Right Now in the form of printed-out programs.
Computer Shopper was the last time I welcomed the ads. You learned so much from those ads.
Lucky you! I had the BASIC cartridge for the... Atari 2600. It came with a joystick in two split halves which, if I remember correctly, you had to plug into the joystick ports (so one in each port). My memory may be failing me, for it was a very long time ago. I still fondly remember the first lines I drew, in colors, using BASIC. One of my very first programs.
I had BASIC for the Atari 2600 (called the Atari VCS at the time). It came with two controllers that had membrane keyboards. There were not enough keys for the alphabet, so you had to use key modifiers to type the full A-Z and 0-9 character set. It was immensely tedious and really a bummer to use. I'm surprised you got far enough to do graphics with it. I don't recall being able to do anything except very simple text-based things like:
10 print “hello”
20 goto 10
BBC Micro for me round these parts but same idea. The 80s were a golden age for bedroom coders and the various 8-bit machines round the world launched thousands of careers. The fact that the machines came with a programming language (and even booted straight into it) gave many cause to experiment. It faded out in the 90s when the concept of what a home computer was changed.
We weren’t well off either, but my dad was able to get a decent deal when our neighbor upgraded. It was put into my room for lack of space - and the rest is history.
For sooooo long the upgrade cycle was so short that you could live quite cheaply and comfortably by inserting yourself in the right place in the chain.
In college I’d upgrade every three to six months and sell the old system off to someone for a good percentage of the new hotness.
Exact same computer, and we were poor too. I learned BASIC on it and basically never stopped coding since then. It’s strange to think about how different my life might have been without that computer.
> It’s strange to think about how different my life might have been without that computer.
Isn't this the honest truth.
Similar story. Learned Logo at school on a Vic-20. Loved it so much I taught myself Basic at Kmart by grabbing the Basic User's Guide off the shelf and typing stuff into one of the C64s on the display case. It impressed my dad so much he put one on layaway. Been programming ever since.
Same story here, but with a Tandy 1000 RLX. We sold our Nintendo NES to help pay for it.
Same, except it was a C64. Many happy days and nights spent learning BASIC and later 6502 assembly with 3 metres of snow on the ground outside and pitch black by 3:30 pm.
I suspect some statistical weirdness going on in the precise formulation of the survey question.
I know for a fact 30M Commodore 64s were sold in the 80s in the USA. Not all Commodores, not all home computers, just the classic model C64: 30M units sold. That's in a country that had only about 250M people at the time, so in theory 12% of Americans as of 1990 had purchased one specific model, the C64, leaving only 3% for all other models combined, which seems very unlikely.
Some of those probably went to schools not homes, although schools were owned by Apple II in those days...
My suspicion is many of those were unused in basements and closets, or the question was phrased weirdly like "have you purchased a computer in the last three years" or "used a home computer in the last month" or they defined "home computer" to be "not an IBM (office) or Apple (school) product" or something like that.
Wikipedia says between 12m and 17m were manufactured, in total.
Tramiel claimed 30M based on a remembered estimate of rough sales numbers per year, but the only estimate that's based on objective evidence - serial-number analysis - is 12.5M.
Yeah interesting. Maybe the 30M figure comes from 6502 shipments. I don't have the tab up anymore that was claiming 30M+ shipments.
Here's an interesting discussion link. Merely being on Wikipedia doesn't mean it's correct; that site is a hive of disinfo in general:
This site even mentions the peculiar 30M figure.
I would tend to believe the linked site's serial-number analysis result of exactly 12.5M. The Americans did something like that to the Germans in WWII; it turns out a remarkably small, totally random sample of sequentially assigned serial numbers is enough to very accurately predict the highest number sold. Assuming very random sampling, which is never truly random, of course.
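This is the classic "German tank problem". A minimal sketch of the standard estimator in Python, with simulated numbers (the sample size of 500, and reusing the 12.5M figure as the simulated true total, are assumptions for illustration, not the actual analysis):

```python
import random

def estimate_total(serials):
    # Minimum-variance unbiased estimator for the German tank problem:
    # given a uniform random sample of sequentially assigned serial
    # numbers 1..N, estimate N as m + m/k - 1, where m is the largest
    # serial observed and k is the sample size.
    m = max(serials)
    k = len(serials)
    return m + m / k - 1

# Simulate 12.5M sequentially numbered machines and a random
# sample of 500 observed serial numbers.
random.seed(42)
true_total = 12_500_000
sample = random.sample(range(1, true_total + 1), 500)
print(f"{estimate_total(sample):,.0f}")  # typically lands within ~0.5% of 12.5M
```

The intuition: the average gap between observed serial numbers tells you roughly how much headroom sits above the largest one you happened to see.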
Doesn't change the overall outcome, however, when there's a stat that a small segment of an industry is "about" the size of what's claimed to be the entire industry, something's off in the numbers.
A mere 12.5M sold is still 5% of the entire US population at that time, and honestly, having been there, almost everyone I knew had a PC clone or some Apple product, usually a Mac. The number must be larger than 10%. "A computer" was required at college ... ed.gov claims there are 19.4M college students in the USA right now and Google claims 332M people in the USA right now, so about 6% of the population is in college right now; so back in 1990, guess that "around" 6% of the population was required to own a computer just to attend higher ed ... the claimed 10% seems like an incredibly low number.
When I went to school, each school usually had only one or two Apple ][s -- but we had a whole classroom of PETs in 7th grade that got replaced by a classroom of C64s in 8th grade. I saved up and bought a TRS-80 CoCo 2; otherwise there would not have been a computer in our house.
Maybe the military made secret clusters akin to the ps2 and ps3 distributed systems they made in the 2000s
Maybe. 15% sounds about right to me. Back in 1990 computers were expensive, bulky devices, required specialized knowledge to operate (Apple's ad copy notwithstanding), and had limited use outside of a few specific domains.
Kinda amazing that nowadays anyone can go to a store and get a "super" computer that fits in their pockets in the form of a smartphone. Want to video call someone across the world? Buy one of those, setup an account for the Apple/Google store (yay walled garden), install WhatsApp or your messenger of choice, and ring ring!
Funny how many of the ads say "Call us for more information about our offerings!" instead of a "For more info: www....".
Call-out: we're also talking about mostly un-networked computers here.
1989 is the very beginning of "the Internet might be useful for general purposes by non-scientists." And incidentally, the same year BGP was dreamed up. https://en.m.wikipedia.org/wiki/The_World_(internet_service_...
So most computers were running boxed retail software with extremely limited hard drive data space (if present at all).
Only fifteen percent of American households had a computer in 1990.
People overestimate the speed of the spread of computers.
I didn't have a computer where I worked until 1994. And then, it was shared by eight people.
At my next job in 1995, I made them buy a computer for the office as a term of employment. At the time, I suggested that a laptop might be a good option, since the computer would be shared by three people. For the next two years, the sales guys made fun of me for wanting a computer at all, and for wanting one that fit on my lap.
I later heard that when I left, they sold the computer. I wonder how those blissfully computer-free sales guys are doing today.
When I worked for Westinghouse in 1996, that was the first time I was in an office that had one computer per person. But not every department had computers at all. And most who did had just terminals hooked up to a Vax in accounting.
When I worked for a large regional media company in 1997, everyone had a computer. Only a couple of them had internet access, and that was only e-mail. This was during the days when so many people were getting AOL at home that it became uselessly bogged down by its own popularity.
1999: Everyone in the office had their own computer. Not everyone used them. But at least they all had internet access.
(OT: I'm sad that the macOS spell checker didn't know the word "Vax" just now)
Your comment is about business use of computers, not households. I worked at a Fortune 200 company and from 1985-1989 much of my job was helping roll out desktop computers and networking (including email) throughout our fairly large geographical territory. By 1990 I'd estimate conservatively that well over half of our office workers had their own networked PC on their desk.
I remember buying an AST 386 for my use at home, I'm guessing it cost about $4K or more in today's dollars, so it is true that business-class PCs at home were probably relatively rare at that time.
In 1995, I started replacing the WYSE VT-100 terminals at my small workplace with Pentium desktops, in order to prepare for the switch to client-server in the next ERP version. They averaged from $2700/ea ($5100 today) for the data-entry models, to $3500 for mine, to $5200 ($9800 today) for the mechanical engineering system. (I still have my notes.)
I know this isn’t relevant to the point about usage in 1990, but it shocks me (as somebody born in 1995) that over the course of 4 years you went from being ridiculed over demanding a computer to a computer on everyone’s desk. 4 years!
Here's something for you to consider.
I remember in the late 80s most people could not type. Typing was done by secretaries and was a specialized skill. There were (optional) for-credit classes in high school dedicated to teaching typing and nothing else.
I got a bunch of part-time temp office jobs as a teenager because I could type fast, having grown up with computers, while my friends would get jobs at supermarkets or convenience stores, etc.
The temp agencies had me do typing tests (on typewriters, not computers) before placing me. I blew them away at some ungodly WPM speed that I no longer remember. Not atypical by today's standards, I'm sure, but standards were different then.
Often I would be the only person in a small office who could type. I certainly was the only male who could type. Everyone else was female.
Imagine that today!
Calculators had a similar trajectory. I used slide rules throughout high school. I needed a calculator for college and got a TI engineering calculator for probably something like $200 in mid-70s dollars. Got a probably discontinued HP a couple of years later for probably the same amount. Not sure how long before they were ubiquitous in the general population but probably not more than 5 years or so.
That's an interesting experience, because my family got our first home computer in about 1981, the kid down the street had one, and there was one in my first grade class around the same time. From then on, they were available in every school I attended, and my (rural Oregon) highschool in the early 90s had four computer labs - for programming, typing, newspaper layout, and CAD. My friend and I were watching AcidWarp on his 386 in about 1991. I had an Amiga at that point, and it was actually a bit of a relic even though it could blow my friend's PC out of the water for certain things. Our town library had a computer system and the office where my dad worked had a Data General mainframe they called the "DG". By 1993 I had a Linux box that I was running as a BBS, and I saw HTML for the first time in the Army in 1995. Then one day I stepped off a train at a random stop in Pusan, Korea in 1996 and some dude about my age walked up and said hi, and we ended up hanging out with his friends and they showed me a Mac with a web browser...and the world was never the same again.
Different contexts. Uptake in homes was swift. I had a home computer in 1982. But my comment was about computers in offices.
Businesses change slowly. Equipment doesn't get replaced on a whim. It has to be amortized and there's tax thingies that mean business equipment lifecycles are 3 to 5 years, minimum.
How were you in 1st grade in 1981 but high school in the early 90s?
In 1990, it would have been somewhat rare for a college student to own a computer, but there were computer labs everywhere on campus and you had an email address, so it was clear they were a part of modern life. They were priced in the 'used car' range, so it wasn't impossible to own one.
(Edit: this was for a 'real' PC compatible or Mac; you could probably find C64s and the like at the flea market.)
In my freshman year in 1993, I was the only person on my floor to haul a PC into the dorms. Across 4 buildings, each with a dozen floors, there were maybe ten people who had computers in their rooms. The only reason I had one was that my dad was an IBMer and he managed to obtain an XT 286 for me.
Same year, and every engineering and CS student I knew had one in their dorm room. The lowest spec I saw was 386SX-16 with 2 MB and the highest was 486DX-50 with 16 MB RAM. Most used DOS/Windows but the CS students dual booted to Linux (Slackware mostly, some SLS).
Similar, I had a Mac SE and was maybe the only person in my dorm section to have a new personal computer. But I also knew someone who had bought a Yugo, which was more expensive.
TIL there was an XT 286. I always thought that the 286 was exclusive to the AT.
Anecdotally, I remember being excited for my dad to get a computer around 1994 or 1995, and it was a Mac IIci. A bunch of people I know got computers around then or within the next couple years, but before that it wasn't much of a thing. One of my earliest computer memories was using ClarisWorks Paint and spending (what seemed like) hours drawing a scarecrow while my parents watched, and then the computer crashed and it was all lost.
My mom had a word processor around then as well (can't remember which came first), a single purpose device for writing with a keyboard, a CRT monitor, and a printer. Looked somewhat like this but I can't remember whether it was the same brand or not.
I remember painting abstract primate pictures with MacPaint when I was quite young. I was sad when we got the new computer because I think MacPaint didn’t make the transition to system 7 or something, and my dad told me we couldn’t open the files.
We also had a IIci. It came out in '89, and I was shocked to learn how insanely expensive it was at launch - $6000... $14k adjusted for inflation.
That was the era of really expensive computers, but each upgrade really seemed to do something phenomenally new - color or resolution or printer or 3D etc.
Nowadays it’s hard to really notice the upgrades without running benchmarks.
We (in Germany) didn’t have one until I was 12 in 1998, before that I’d visit my dad at work sometimes to play Prince of Persia on his office computer. I guess technically we had a Sinclair ZX81 in the cellar, but I wouldn’t find out about that till much later ;)
We (in Germany), by 1998, had 3 computers at home: a 400 MHz Celeron connected to the internet, a hand-me-down 486 for my brother, and a hand-me-down 386 for me. All my friends were starting to get access to Pentiums with TNTs; by 2000 everybody was playing UT and Counter-Strike.
Sometime in the mid-90s the local Microcenter had a deal on a $999 PC. IIRC it was running Windows 95, but was otherwise a bare-minimum system. The "deal" was that the PC also had an ad-infested border around the screen that ate up some non-trivial amount of the screen space.
The line to buy it wound through nearly every aisle in the store, out the front door, and down the block, for over a week. People were asking for advances on their paychecks, taking out loans, selling cars - anything they could do to get a computer at this magic price point.
The main selling point? It had a modem and people could see what this "internet" business was all about for the first time.
We had a computer at home when I was a kid, and in retrospect I was surprised to realize how unusual that was. I imagine it gave me far more of an advantage than I realize.
My dad was taking a CS course starting in 1989, and they still had to send written programs by post to Warsaw because it would compile faster and "more accurately" there. From what he described, the professors would do a "code review", but you only passed if it compiled there.
Personally I talk to computers a lot these days.
It isn't riveting conversation, just stuff like "Sunset" (to make the lights warmer), "remind me to buy cheese when I get to $GroceryStore", "play $album please", but it adds up.
I also take a lot of pictures, which have become unreasonably good to the point where I'm still learning how to take a better picture with my fancy mirrorless than I can take with my phone. Both of them are computers.
After I take those pictures it does accurate analysis of what's in them, so when I search for cats, or spider, or flowers, it finds them. It does this on the device, which is pretty cool.
I have another computer that flies, I can tell it to fly circles around a target or do a bit of following. It's neither an expensive nor featureful example of its class. It flies for a real 25 minutes on one battery and weighs 249 grams.
There's another one which cleans my floor, to be honest we could have done an okay job of that in the 90s, batteries and chips were almost up to it.
Then there's the one that I can tell to make fantasy dwarves and it just does it. I think that's the one younger me would have been most impressed by.
You couldn't do it at the same time. It's nice to say "we could run a spreadsheet or listen to music", but it's almost like these were mutually exclusive. Winamp was light on resources, but if you went on to try a large-ish sheet on Excel, the music would skip, or the app would crash, or both.
The main amazing draw to Linux in the early days was that you could renice mpg123 just enough to keep audio playing while using the computer for other things.
It could also burn a CD without freezing the system or producing a coaster.
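For anyone curious what renice actually did there: it adjusts a process's "niceness", its claim on CPU time relative to everything else. The same knob is reachable from Python via os.nice; a small sketch (the increment of 5 is arbitrary, and note that lowering niceness below 0, as the mpg123 trick required, needs root):

```python
import os

# os.nice(increment) adds to the calling process's niceness and
# returns the new value. Raising niceness (being more "polite")
# is always allowed; lowering it, e.g. renice -n -10 on a music
# player so the audio never skips, requires root privileges.
before = os.nice(0)  # an increment of 0 just reads the current niceness
after = os.nice(5)   # give other processes more of the CPU than us
print(before, after)
```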
True, early PCs were really bad at multitasking, but I haven't had many problems since DMA and L2 caches became available.
Modern software benefits from increased computational power because it allows new features and speeding up older ones. Sure, “office” apps don’t benefit much, but you’re ignoring many fields where they do benefit.
For example, the field of 3D graphics. Games and animated movies have become a lot more realistic and feature filled thanks to more powerful graphics cards. In fact, Disney specifically puts a lot of effort into making hair realistic. That was impractical a decade ago, and impossible a decade prior.
Meh. Are the stories being created with games getting better? For example, Half Life and Portal are pretty modern and immersive and run on some 20-year-old hardware.
The storylines and the graphics are orthogonal. It's possible to make immersive and fun games with "poor" graphics (Portal), and it's possible to have bad storylines with amazing graphics.
Even if you’re fond/nostalgic for older hardware and games, that doesn’t mean you can’t recognize that things have improved.
Meh. Are stories being created in books getting better? For example, The Decameron and Canterbury Tales are pretty impressive and were written before the printing press.
Stellaris and Cities Skylines have incredibly detailed models that I basically never see because I always play zoomed out.
You're absolutely right about 3D graphics, but how much time does the average desktop computer user spend rendering hair?
Even if you need bigtime compute power for video games, there are game streaming services where someone else's computer will do that for you.
I have a high-end graphics card and all the processing power I need to play games... but I am still wasting all of that whenever that isn't what I'm doing, aren't I?
> I am still wasting all of that whenever that isn't what I'm doing, aren't I?
How is this different from owning anything? I have a bike, but I’m not riding it literally all the time. But I still don’t think owning it is a waste.
I feel that single-threaded processing power stopped increasing at 2 major events in history:
* The arrival of video cards around 1997 (focus shifted from general computation to digital signal processing)
* The arrival of the iPhone around 2007 (focus shifted from performance to power consumption)
I'd vote to undo these setbacks by moving to local data processing, where a large number of cores each have 1/N of the total memory, shared by M memory busses. Memory controllers would manage shuffling data to where it's needed so that the memory appears as 1 contiguous address space to any process.
In other words, this would look identical to the desktop CPUs we have today, just with a large number of cores (over 256) and a memory bandwidth many hundreds or thousands of times faster than what we have now if it uses content-addressable memory with copy-on-write internally. The speed difference is like comparing BitTorrent to FTP, and why GPUs run orders of magnitude faster than CPUs (unfortunately limited to their narrow use cases).
This would let us get back to traditional programming in the language of our choice (perhaps something like Erlang, Go or Octave/MATLAB) rather than shaders.
Apple appears to be trying to do this with their M1 and ideas loosely borrowed from transputers. But since their goals are proprietary, they won't approach anything close to the general computing power available from the transistor count for at least a decade, maybe never.
So there's an opportunity here for someone to reintroduce multicore CPUs and scalable transputers composed of them. Then we could write whatever OpenGL/Vulkan/Metal/TensorFlow libraries we wanted over that, since they are trivial with the right architecture.
This would also allow us to drop async and parallel keywords from our languages and just use higher-order methods which are self-parallelizing. Processing big data would "just work" since Amdahl's law only applies to serial and sequential computation.
The advantages are so numerous that I struggle to understand why things would stay the way they are other than due to the Intel/Nvidia hegemony. And I've felt this way since 1997, back when people thought I was crazy for projecting to the endgame like with any other engineering challenge.
> I'd vote to undo these setbacks by moving to local data processing, where a large number of cores each have 1/N of the total memory, shared by M memory busses. Memory controllers would manage shuffling data to where it's needed so that the memory appears as 1 contiguous address space to any process.
Cheap RAM is DDR. Fast RAM would be on-die, but that would be very expensive - or maybe now on-package (though with some tech still to be developed). But apart from decoupling access latencies, I don't really see the point of having N busses (from each local core to its local memory), especially if you need a very large number of cores. More memory channels seem good enough. The bandwidth is already hard to saturate on a well-designed SoC like the M1 Pro and above; improvements to latency would probably yield better benefits than trying to increase the bandwidth further.
> In other words, this would look identical to the desktop CPUs we have today, just with a large number of cores (over 256) and a memory bandwidth many hundreds or thousands of times faster than what we have now if it uses content-addressable memory with copy-on-write internally. The speed difference is like comparing BitTorrent to FTP, and why GPUs run orders of magnitude faster than CPUs (unfortunately limited to their narrow use cases).
"content-addressable memory with copy-on-write internally" are you describing what caches already kind of do, in a way (esp. if I mix that with: "memory appears as 1 contiguous address space to any process")? The good news would then be: we already have them :)
What remains, if I fully understand what you mean, seems to be: more cores. The other good news here is that this is in progress. Where 6 years ago you would have gotten 6 to 8 cores on an enthusiast platform, you would now probably choose 12 to 16 cores on just a basic one (and even more on a modern enthusiast one).
There has been a pause in recent years, but it was basically Intel having process difficulties and being caught up by the rest of the industry, including some who also had power consumption in mind. And given what a high-perf CPU dissipates today, power consumption has become key to unlocking raw performance anyway.
I don't know how to control for other factors, like bus speed and RAM bandwidth, but:
- 2007 single-core performance: Geekbench 5 score ~ 500.
- 2021 MacBook Air M1 single core: 1750
Ok, only a factor of 3 or so. And only 2x as many cores.
I'm comparing Core 2 Extreme to a low power portable design, albeit one with notably high single-core performance.
The shift to a focus on power consumption was already happening anyway, even without the iPhone, including on desktop. CPUs were already in nuclear-reactor territory in terms of how much heat they could produce per unit area.
If developers could trade more resource usage (CPU, memory, storage, network bandwidth) for better developer UX, they would do it in a heartbeat, which is why, no matter how much computing power has progressed, most software doesn't seem to get any faster and keeps using more and more resources. On the plus side, software development is much easier today than it was decades ago.
A lot of what I'm doing now would have been insanely expensive or simply impossible when I was a kid. Just as a for instance, I have a half petabyte of video and music stored on a local server to play over my local network. That half petabyte of storage is fast enough to serve over the local network and cost less than 1/3 the price of 10 megabytes of storage in the advertisements in the article.
The difference is that you can now casually manipulate a spreadsheet of the size that would choke a supercomputer back then … on an iPad.
My watch has orders of magnitude more processing power and working memory than my first PC in the mid 90’s. It weighs maybe 200 grams and runs on battery power for ~20 hours.
If that doesn’t feel like progress then I dunno …
We sell entry level computers that choke on small spreadsheets and are less responsive than the same size spreadsheet was 25 years ago on a computer with a thousand times less computing power.
Yes, we can handle much larger data now with proper hardware, however, most people don't do that, their needs for documents and spreadsheets are just the same as it was earlier, but modern systems somehow manage to be worse despite having orders of magnitude more processing power and working memory.
The icon image for the hard drive on MacOS is larger than the entirety of the original Mac system disk.
Youtube, Netflix, and Zoom wave hello, in 1080p+ and stereo sound.
What I remember of those early computer ads is they showed a smiling family all gathered around a computer, joining in with some wholesome activity.
Firstly, that never happened. The moment my ZX81 came into our house there were nightly fights for the television (early computers needed the telly as a display). The computer completely alienated my parents and siblings who were mainly grateful that my interest in electronics and hacking was a pacifier.
Secondly, the distance between those images of technology as a connecting force and today's reality could hardly be more striking. Personal computers are objects of radical individualism. Four members of the family, each staring into their own six-inch digital world, faces lit from below in bluish light, would be the right image.
So the question of "how far we've come" is more nuanced than kilobytes of RAM and megahertz of processing power.
> showed a smiling family all gathered around a computer
They were always so dressed up, like they were planning on attending a wedding and at the last minute decided to play Donkey Kong.
The real world looks as it always did, as I sit here "computing" in my gym shorts and a dirty tee shirt from changing the lawnmower oil this morning.
Coincidentally I just watched a 90 second clip from The Brady Bunch. I’d forgotten how formally everyone dressed on TV … even in their own homes with no guests visiting. The father is wearing slacks and a tie. The girls wear dresses. Not realistic.
> were planning on attending a wedding and at the last minute decided to play Donkey Kong.
One of my earlier memories is sitting on my Dad's lap playing some Infocom game on his brand-new PC-AT. I didn't get to go into the study that much, so it was a big deal.
I still have the F series keyboard from that computer, which ended up housing my first three motherboards as well.
> sitting on my Dad's lap playing some Infocom game
That's heartening. I'm also trying to do a good job as a dad, letting my daughter have positive experiences with computers, to learn to have fun, respect but also command them, to be in control.
My mid-80s starting salary out of college with a BA was $25K - pretty good at the time, and on the order of $70K today. That $3K in the 80s would have been 1/8th of my annual salary. Most people would consider that a very large expenditure and not something they could find by digging under the couch cushions and breaking open piggy banks.
I don't disagree with any of that, none the less, they were selling at $3K then and have to cut price to $300 to sell now, so people are obviously 10x poorer now than in the 80s. Can't argue with actual historical economic activity.
It's worth pointing out that I was paying something ridiculous like $115 for health insurance around 1990, and it's running about $1750 now for my wife and me, and that's with horrifying copays and such. Somebody with $70K is still getting $70K; they're just spending it on rent and medical expenses now instead of $3K hard drives.
On one hand, instead of life involving $3K hard drives, we now supposedly have better medical care and nicer houses. On the other hand, it's not like lifespans are increasing, LOL.
> they were selling at $3K then and have to cut price to $300 to sell now, so people are obviously 10x poorer now than in the 80s
Can you explain this? I don’t understand your reasoning. Doesn’t the issue of “cheaper to manufacture” have anything to do with price?
Eh, your economics have some issues here. Back then people were extremely stressed about spending that much. At least where I lived, people could not spend that much, and a lot of the computers that were bought were financed.
One of the big things that has changed is the massive increase in housing costs and rents, which is causing all kinds of economic issues.
It sounds like we're roughly the same age, so I won't go completely into "get off my lawn" mode. But most families in the 80s could not scare up $3K easily, it was a boatload of money; and if they could, it was for emergencies.
It's easy to remember our youth, and think of how easy it was. And it may have been for us (well, I was working or in college most of the 80s), but not our parents who foot the bills.
I grew up with a Mac, so my experience was a little different, but I'm amazed by people whose first experience sitting down at a computer was staring at an unforgiving blinking cursor.
The tragedy of old age is not that one is old, but that one is young.
- Oscar Wilde.
We were all learning the same things as you were, at about the same time, on the job (at work).
Many late hours, trying to keep up.
You did very well, I think.
Maybe the main difference for me, as a 1970s-80s kid, was that I taught myself some small bits of assembly language programming.
I feel very fortunate to have come along at a time where home computers were simple enough to understand.
The circuits were literally black boxes, so no doubt the real old-timers would chuckle at this assertion: they had to build the computers out of discrete electrical components, one transistor at a time... But I could understand a CPU that had a single register, an "accumulator"...
I have a couple of TI-99/4As and a C64 out in the garage that were given to me as gifts... not the original (that died a long time ago).
I learned to program on the TI: TI-BASIC first, then Extended BASIC, then some assembler. The C64 was great because you had to understand how to work directly with the hardware. On the TI, you had a nice library call like CALL SOUND; on the C64 you had to POKE everything to the correct addresses to coax sound out of it (often what you would do in a single line on the TI would take 4-5 lines on the C64, but the C64 was fast and had lots of memory). Good times.
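For anyone curious what that contrast looked like in practice, here's a rough sketch of both (from memory, so treat the exact values as approximate: the C64's SID registers start at 54272, and the two frequency bytes are a back-of-envelope figure for roughly A440 on an NTSC machine):

```basic
REM TI EXTENDED BASIC: ONE STATEMENT
REM (DURATION IN MS, FREQUENCY IN HZ, VOLUME 0=LOUDEST)
100 CALL SOUND(1000,440,2)

REM C64 BASIC: POKE THE SID CHIP'S REGISTERS DIRECTLY
100 POKE 54296,15                  : REM MASTER VOLUME TO MAXIMUM
110 POKE 54277,9                   : REM VOICE 1 ATTACK/DECAY ENVELOPE
120 POKE 54278,0                   : REM VOICE 1 SUSTAIN/RELEASE
130 POKE 54273,28:POKE 54272,214   : REM FREQUENCY HIGH/LOW BYTES (~440 HZ)
140 POKE 54276,33                  : REM SAWTOOTH WAVEFORM, GATE BIT ON
```

One line versus five, just as described above.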
I'm a bit younger than a TRS-80, but yes, I do. It's at my parents' house: a 1989 Compaq Deskpro running OS/2. AFAIK, it still runs.
My Atari 800 died in a lightning storm in 1983, but I still have the replacement 800XL downstairs, with floppy drives, a cassette drive, a thermal printer, and a 300 baud modem. Also an Ape Face parallel-port adapter for the Epson RX-80, which I don't have anymore. Lots of floppies and cartridges, too. And JForth, along with the manual.
We still have my wife's first computer (Sinclair ZX81, she was the first kid in school to have their own computer), and my second (Amstrad CPC6128 & colour monitor).
I still have my first keyboard, an IBM Model F. The PC-AT that went with it was Ship of Theseus'd into the late-486 era; I abandoned it when tower cases became standard.
The craziest thing about Byte Magazine was that it was published in Peterborough, NH. I was friends with the editor's son in the early 90s... If you've ever been to that area of NH, you'd be amazed that a high tech magazine of Byte's stature was published there. I'd be surprised if the town even had decent broadband before the 2000s. Physically and culturally, it's about as far away from Silicon Valley (where I live now) as you can get.