Back in the 80s, when I was growing up, all personal computers came with a BASIC interpreter that you could use to write your own software. As a matter of fact, end users were expected to write their own applications.
My very first personal computer was an Atari 800. I was in my early teens when I got it, a hand-me-down from my uncle, who had gotten himself a shiny new IBM PC.
During that time, computer magazines came with games and applications in source code form that you had to type into your computer in order to "install" them. A lot of us didn't know exactly what all those lines of code meant, but we wanted the game or application, so we typed away. Unfortunately, typos were an issue, since we were blindly copying what looked like Greek into our BASIC prompt. Fortunately, BASIC was interpreted, so it would catch syntax errors immediately, but many times the syntax was correct and there was still a typo in the line, making the program not run as expected. It could be frustrating at times, but it was very satisfying to finally get the code to work exactly right. You could also experiment, making little changes here and there to see if you could change the behavior of the software. I remember eagerly waiting for the next issue of A.N.A.L.O.G. magazine to arrive in the mail every month to see what goodies it would bring.
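To give a sense of what those listings looked like, here is a tiny number-guessing game written from memory in an Atari-flavored BASIC. Details like RND(0) vary from dialect to dialect, so take it as a sketch rather than an exact period listing:

    10 REM GUESS THE NUMBER
    20 N=INT(RND(0)*100)+1
    30 PRINT "I'M THINKING OF A NUMBER FROM 1 TO 100"
    40 PRINT "YOUR GUESS";
    50 INPUT G
    60 IF G=N THEN PRINT "YOU GOT IT!":END
    70 IF G<N THEN PRINT "TOO LOW"
    80 IF G>N THEN PRINT "TOO HIGH"
    90 GOTO 40

Notice that a single mistyped character, say a > instead of the < on line 70, would sail right past the interpreter's syntax check, and the game would cheerfully give you backwards hints. That is exactly the kind of bug we spent evenings hunting down in our own typing.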
It is worth mentioning that at this time there wasn't yet a dominant architecture for personal computers. Some of us had Ataris (8-bit and/or ST), others had Commodores (PET, Commodore 64 or 128, Amiga), others had IBM PCs, and other architectures existed as well. What all of these machines had in common was that they came with a BASIC interpreter; in most cases, the machine would boot directly into a BASIC prompt. The BASIC dialects were not 100% compatible with one another, since vendors modified them to highlight specific features of their own products, but in general your BASIC skills carried over across architectures.
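As a small example of those dialect differences, even something as simple as clearing the screen was done differently on each machine. These are from memory, so treat them as approximate:

    PRINT CHR$(125) : REM CLEAR SCREEN ON AN ATARI 8-BIT
    PRINT CHR$(147) : REM CLEAR SCREEN ON A COMMODORE 64
    CLS             : REM CLEAR SCREEN IN GW-BASIC ON A PC

The loops, variables, and PRINT statements were close enough that you could sit down at a friend's machine and get something working, but the machine-specific tricks always needed translating.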
I remember being amazed at the wonderful things you could make these machines do, and it really motivated me to learn to write my own software, not simply blindly type in code listings from magazines. A lot of software developers from that era got our start that way; at the time, the barrier to entry for software development was very low. I derived a lot of satisfaction from creating software, and I would proudly show my creations to my friends and relatives. All of this motivated me to pursue a career in software development, and it is why I majored in computer science when I went to college.
Somewhere in the 90s most of these architectures disappeared, and the one true personal computer platform emerged: the IBM PC, or what we simply call a PC today. Just like the other platforms of the time, the IBM PC came with a BASIC interpreter, but unlike the others, BASIC wasn't built into the operating system; it was something you had to look for if you wanted to use it. When the PC became the de facto standard, the focus on end users as programmers started to decline. Magazines stopped coming with BASIC listings for you to type in, and eventually PCs stopped coming with a BASIC interpreter altogether. Now if you wanted to develop software, you had to install a compiler or interpreter yourself, which, sadly, is still the case today.
So I wonder, how do new generations of software developers get their start? It is not as easy to "get your feet wet" these days as it was back then. I wonder if they pick computer science without knowing exactly what they are getting into. It's a shame that software development is not as accessible as it once was.