@permacomputer @dirtycommo
Thinking about your #TinyBASIC project:
What I really loved about coding in BASIC on 8-bits was the simplicity, mostly when it came to graphics: you would just clear the screen and draw. This simplicity was lost with all the attempts to make the code portable (which is always limited anyway). But other features actually complicated things.
If we can make the interpreter really tiny, like 4 or 8 KB, then the freed-up RAM would allow for some kind of RAM disk. A full-screen editor doesn't make sense with a single file, because in a normal BASIC interpreter only one program is in memory at a time, so the editor could not be used on its own source code.
DATA and READ are a horrible approach, especially if you consider the limited amount of memory available. You have to store everything twice: once as source code and once as data. This would be OK for compilers, as you would somehow convert the DATA directly to binary data and READ would create pointers to the existing strings (with numbers it would be more tricky), but it is extremely limiting when interpreting code: the interpreter, the source code and the binary data all have to exist in the same 64 KB of RAM.
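A minimal sketch of how an interpreter could dodge the duplication for strings (Python stands in for the imagined interpreter; `TinyData` and every other name here is invented, not from any real BASIC): record each DATA item as an offset span into the single stored copy of the source, so READ hands back a reference instead of a second copy.

```python
# Illustrative sketch, invented names: DATA items are kept as
# (start, end) offsets into the one stored copy of the program text.
# In Python a slice still copies; on an 8-bit machine these spans
# would simply be pointers into the tokenized source.

class TinyData:
    def __init__(self, source):
        self.source = source     # the single copy of the program text
        self.spans = []          # (start, end) of each DATA item
        self.cursor = 0
        pos = 0
        for line in source.splitlines(keepends=True):
            parts = line.strip().split(None, 2)
            if len(parts) == 3 and parts[1].upper() == "DATA":
                for item in parts[2].split(","):
                    item = item.strip()
                    # naive substring search is good enough for a sketch
                    start = self.source.index(item, pos)
                    self.spans.append((start, start + len(item)))
            pos += len(line)

    def read(self):
        start, end = self.spans[self.cursor]
        self.cursor += 1
        return self.source[start:end]

prog = TinyData('10 DATA HELLO, WORLD\n20 PRINT A$\n30 DATA 42\n')
print(prog.read(), prog.read(), prog.read())  # HELLO WORLD 42
```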
What would be nice is if BASIC treated the available RAM as a kind of RAM disk, with named files. Like a mini-CP/M. Or a mini-Python. But it would still be an interpreter. Instead of RUN, you would type the program name.
So EDIT EDIT would edit the source code of the editor itself.
You would have to name your program, like in big languages, but you could name it RUN. So EDIT RUN would edit your "main" program file, which would still be executed by RUN. There would even be LIST, which could list the last executed or edited program: for each "system"-level command, the last used argument would be supplied as the default, so you could cycle EDIT, RUN and LIST, and your EDIT could still be written in BASIC.
I think making BASIC extensible in BASIC would be a very interesting design choice.
@permacomputer @dirtycommo I mean, you could even hide the fact that the main program file is named "RUN", because it would be the built-in default. Only when given an argument would a command open a different program than the default.
EDIT without arguments would just open the file called RUN, which would be executed by typing RUN. RUN would just be a kind of synonym for the main() function in C.
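The EDIT/RUN/LIST cycling could be prototyped in a few lines. A hedged Python sketch (the command names come from this thread; `ramdisk`, `dispatch` and the rest are invented):

```python
# Sketch: every "system"-level command shares one remembered argument,
# defaulting to RUN, so EDIT / RUN / LIST cycle on the same program.

ramdisk = {"RUN": '10 PRINT "HELLO"'}   # named files in RAM
last_arg = "RUN"                        # built-in default: the main program

def dispatch(line):
    global last_arg
    parts = line.split(None, 1)
    cmd = parts[0].upper()
    if len(parts) == 2:
        last_arg = parts[1].upper()     # explicit argument becomes default
    name = last_arg
    if cmd == "LIST":
        return ramdisk.get(name, "")
    if cmd == "EDIT":
        ramdisk.setdefault(name, "")    # a real EDIT would go fullscreen
        return "editing " + name
    if cmd == "RUN":
        return "running " + name        # a real RUN would interpret it
    return "?"

print(dispatch("EDIT GAME"))   # editing GAME
print(dispatch("RUN"))         # running GAME (last argument reused)
print(dispatch("RUN RUN"))     # running RUN (back to the default "main")
```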
Now, we can consider these programs stored in memory to be like Python modules or C functions.
Because I don't expect permakeyboards to be really comfortable to use, typing as few characters as possible, while still keeping the code readable for people with BASIC skills from childhood, would be nice.
The modular approach to the interpreter would probably allow more people to participate in development. You would be able to save memory by deleting some of the modules loaded from ROM on boot: the modules would just sit in the RAM disk, and you could use something like DEL or DELETE or CLR or whatever to remove a module.
So if you want more memory and don't want graphics, you would do DELETE GRAPHICS. And of course, you would have to somehow GOSUB into other modules, because modules could export some of their functionality (but probably not global variables? Is that even possible in BASIC?)
So you could invoke EDIT from your program to edit some text field: EDIT A$ would let you launch the fullscreen editor for a given string... which would mean that at the system prompt level, the entire "file" in the RAM disk is passed as an argument and returned when finished.
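A tiny sketch of the deletable-modules idea (Python; all names and the dummy module contents are invented): modules are copied from ROM into the RAM disk at boot, and DELETE reclaims the bytes of any module you don't need.

```python
# Sketch: boot copies ROM modules into the RAM disk; DELETE GRAPHICS
# then frees that module's bytes. Module sources are dummy strings.

ROM_MODULES = {
    "GRAPHICS": "...graphics module source...",
    "SOUND": "...sound module source...",
    "EDIT": "...editor source, itself written in BASIC...",
}
RAM_TOTAL = 65536

ramdisk = dict(ROM_MODULES)            # boot: modules land in the RAM disk

def free_memory():
    return RAM_TOTAL - sum(len(src) for src in ramdisk.values())

def delete(name):
    ramdisk.pop(name, None)            # DELETE <module>

before = free_memory()
delete("GRAPHICS")                     # don't want graphics? reclaim it
print(free_memory() - before)          # bytes recovered: the module's size
```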
This would really rewrite BASIC into something like nano-Python, but when I was a boy I always wanted to design "my own BASIC", so these ideas actually seem fun :-) The core of the interpreter would have to be a basic garbage collector for strings and arrays... and then we could implement BASIC in BASIC itself...
@permacomputer @dirtycommo It is starting to sound perhaps more like "Tiny Plan 9" :-)
@pavel On the other hand, there was something about BASIC which made it immediately readable for everyone, although you had to be a bit of a Pastafarian to enjoy the resulting spaghetti code 🙂
Python is generally great, but the readability of more complex one-liners for beginners is limited. And constructions like ''.join(...) are slightly cumbersome.
E.g. the easy recognition of numbers from strings$ in BASIC was not replicated in any other language (except maybe integers starting with i, j, etc. in early Fortran - still replicated today, though coders don't know why).
Because of the infinite number of BASIC dialects, the good ideas which could have moved it forward were never really standardized. Those few common features really resemble a board game more than a programming language. But the same can be said about e.g. Excel macros, and don't get me started on SQL... and also, all the clones of C syntax (JavaScript, Java, PHP...) created at least as much confusion.
The line numbering really made sense on devices where you couldn't have fullscreen editing, for whatever reason. Once you add LABEL, ELSE and such statements, it is no longer the original spartan BASIC.
It is like saying kickbikes are inferior to bikes: in a way, yes, but I still like kickbikes. They are lightweight, and the air drag can be an advantage if you don't want to brake too much downhill. And you don't get hurt by sitting on a hard saddle, even when traveling long distances. So BASIC is like kickbikes: we consider them toys, but actually there are fewer parts which can break, which makes the design robust and lightweight. Kickbikes are also safer than e.g. paragliders, even if neither can compare with e.g. a business jet...
@xChaos @pavel @dirtycommo @permacomputer
Lua is hard to beat for new programmers. Simple data structures, but decent speed.
@cian @xChaos @pavel @permacomputer I suppose what has not been mentioned yet is that the permacomputer project relies so heavily on BASIC because of its capacity for being typed in by hand at length.
This cannot be done with Lua without some difficulty.
Tiny BASIC is an excellent choice for the project's research and campaigning because:
- it fits on a microcontroller
- it has a large and vibrant culture
- it can be typed in from printed paper
There is a great deal of discussion in the repo.
@cian I acknowledge that it exists. But it lacks the advantage of being the first programming language I ever saw when I was maybe 11 years old (?), watching someone type in a simple game on a ZX-81 at the local youth computer club: a cursor-key-steered character (letter) chasing a random-number-steered one :-)
@cian Revisited Lua again... it is definitely clean and simple, but it feels like it was written by Pascal fans who finally changed their minds about having to write BEGIN (which was a really terrible idea on memory-starved 8-bits, if you think about it) - but stuck at least with END 🙂 So it is just half-way there.
Python is like a mix of BASIC without line numbers (really - not just print, but also string immutability, the def keyword, and so on) and Pascal without both BEGIN and END (Python borrowed the walrus operator from Pascal, beating both the C-style languages and Pascal by having 3, not 2, styles of the = operator).
But re-imagining Python on 8-bits, with such limited resources, is really hard. Yes, the source is compact, which would be a big advantage. But objects are hard to implement without memory bloat. It would be a huge challenge.
Lua at least borrows == from the C-style languages. I think the worst thing about BASIC is perhaps not GOTO, but the ambiguity of the = operator. I know: assignment is not allowed inside expressions in BASIC, but the dual nature of = feels weird these days. I would at least allow == as an option for better readability...
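One way a tiny interpreter could allow that without breaking old listings (my sketch, not an existing implementation): the tokenizer simply folds `==` into the same token as the comparison `=`, which is safe precisely because BASIC forbids assignment inside expressions.

```python
# Sketch: accept "==" as an optional spelling of the comparison "=".
# The regex and function name are invented for illustration.

import re

TOKEN = re.compile(r'==|<>|[<>]=?|=|\w+\$?|\S')

def tokenize(stmt):
    tokens = TOKEN.findall(stmt)
    # fold "==" into "=": both spellings mean comparison in expressions,
    # and a statement-position "=" stays an assignment either way
    return ["=" if t == "==" else t for t in tokens]

print(tokenize("IF A == 10 THEN LET B = 1"))
# both spellings tokenize identically:
print(tokenize("IF A = 10 THEN LET B = 1"))
```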
@dirtycommo I have been following your project for some time and I think I somehow understand your intentions. As with all other cases in my life, I would never really join you until it is too late and no one is interested anymore :-) and you don't really need me anyway.
But it is interesting, because ever since the 1990s I have been in touch with people who repeatedly complained about software being "bloated", requiring too much RAM, swapping to disk, etc. I thought Linux and open source were the solution, and gave up my 1990s efforts in DOS, not really believing in FreeDOS. But you go much further back into history than FreeDOS, which is amazing.
A lot of my friends tried to retreat to something like a Raspberry Pi and live with quite minimal setups... but still with some Linux, some Internet, and so on. Even some graphics. I am not so hardcore; I usually just use some outdated notebook...
First of all, I don't really believe in typing in source code; I always used at least cassette tapes. But you are right that we don't have any real removable media now! Well, microSD cards, OK. But they are probably not easy to connect to a single chip (microcontroller). So I admit that a really tiny computer would have the removable-media problem these days. The cassette tape was an Overton window which is now closed. Scanning and printing is hard.
So what remains is some kind of flash ROM. Maybe transferring programs directly from computer to computer... just the first person types a program in; others can share it if they meet and connect in person.
Something like CP/M could be used for that. Your local flash drive would always be A>, and if you connected to another computer, you would see the other computer's flash drive as B> and could copy files. Physical contact would be required, which would be fun. No directories needed, just filenames...
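The A>/B> exchange is simple enough to sketch (Python; `drive_a`, `drive_b` and `copy` are invented placeholders for the imagined link):

```python
# Sketch: after two machines connect, each sees its own flash as A>
# and the neighbour's as B>; COPY moves one named file across. Flat
# namespace, no directories - just filenames, as described above.

drive_a = {"RUN": "10 PRINT 1"}        # my flash ROM
drive_b = {"GAME": "10 PRINT 2"}       # the neighbour's, over the link

def copy(src, dst, name):
    dst[name] = src[name]              # e.g. COPY B:GAME

copy(drive_b, drive_a, "GAME")         # the type-in spreads by contact
print(sorted(drive_a))                 # ['GAME', 'RUN']
```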
But if your target hardware does not support some kind of external link, then you really have to type...
@dirtycommo Why do you say Lua is hard to type in, again? I think you told me this once but I don't remember the answer now. Maybe something to add to the repo?
@dirtycommo @cian @xChaos @pavel @permacomputer Oh, is it that it's hard to tell what indentation a new page begins on? Whereas BASIC is never indented and the line numbers provide some redundancy?
(Me, I always found the line numbers very hard to read. Even if it's easier to type in, I still have to read afterwards what I typed in!)
The argument that does make sense to me is you are memory constrained. If there's not enough RAM for a proper editor, you're forced to use line numbers.
@pavel I took a quick look at Lua, and yes, it can be described either as "Pascal without BEGIN" or as "Python with END". I like the fact that it has associative arrays - {} - like Python. But try to implement associative arrays on 8-bits... efficiently! Even BASIC variable names in classic dialects tended to rely on only the first two letters of the name...
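Classic BASICs made the variable table cheap in exactly that way: with only two significant characters, lookup is a short linear scan over a tiny table, and no hashing is needed. A hedged Python sketch of the trick (function names invented):

```python
# Sketch of the classic 8-bit shortcut: only the first two characters
# of a name are significant, so the variable table stays tiny and a
# linear scan replaces any hash function.

table = []                     # list of (two_char_key, value) pairs

def key(name):
    return name.upper()[:2]    # "SCORE" and "SCALE" share slot "SC"

def let(name, value):
    k = key(name)
    for i, (k2, _) in enumerate(table):
        if k2 == k:
            table[i] = (k, value)
            return
    table.append((k, value))

def get(name):
    k = key(name)
    for k2, v in table:
        if k2 == k:
            return v
    return 0                   # unset numeric variables read as 0

let("SCORE", 100)
print(get("SCALE"))            # 100 - collides, as on real 8-bit BASICs
```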
I have noticed that Lua is the scripting language used by the wonderful free RPG "The Battle for Wesnoth". I was also told it is extremely easy to link with C applications, because it is provided as a library. So there may be reasons to look at it, because it seems to be a very readable and clean design...
But for type-in enthusiasts, the line numbering and zero indentation of BASIC may make sense. Also, there are platforms with little visual/fullscreen editing capability. And the number of people who encountered BASIC when young and are still alive counts at least in the tens of millions. For Lua, you would get... tens of thousands?
Consider it something like doing "coding" instead of Sudoku or Wordle or something like that. It is perhaps just a hobby, a quick brain-and-fingers workout. I sometimes do "recreational coding", not as type-ins from paper, but I can understand that some people may like to type code, just like some people use recipes from printed cookbooks.
I feel strange about people using AI chatbots to help them write code in relatively complex and unsafe languages like C. I believe programming ran into a dead end when well-paid programmers tried to keep it very complex and esoteric, perhaps subconsciously, maybe to avoid competition? This eventually resulted in the AI bubble we live in now...
We need a readable and safe programming language for the masses, to be used instead of generating slop. Lua is one notable attempt. Python is now very popular, but it is also progressively growing more complex.
BASIC seems like a dead end, but it would be adequate for many tasks.
@pavel @dirtycommo @cian @permacomputer
As for a new platform, air-gapped from the Internet, something like "Meshtastic for coding"... well, I think we may all need some platform air-gapped from AI surveillance very soon.
The options are very limited. There are still a lot of old PCs which can easily be disconnected from the Internet, but this approach lacks the appeal of a minicomputer-in-a-connector.
Also, we lack removable media. Of course, there are SD cards, but they require something on the Raspberry Pi level, not a microcontroller/single chip.
As I mentioned before, if the permacomputers include some integrated flash ROM, I wonder about simply connecting two such permacomputers together to share files. You would just see "my ROM" and the "neighbour's ROM" after booting the connected machines. Nothing complicated.
It would still require human interaction to share files - no networking. We would avoid the unreadable-media vs. computer-with-no-media problem. People would be able to exchange files when they physically meet, and to build libraries of files.
Of course, in theory, flash ROMs have a limited number of writes. So it wouldn't really be "permanent".
I don't really have any serious plan for that. I just feel that "consumer informatics" is getting increasingly detached from the basics of general-purpose computing.
And as someone who has followed the field since the 1980s and gets repeatedly beaten and obsoleted by every new innovation wave, of course I think about "where it all took the wrong turn".
A lot of people do something in the field... but it seems absurdly detached from the purpose computing was invented for in the first place (like, e.g., landing on the Moon or so).
@pavel yes, I see, but manual typing of code is really a marginal use case for line numbering.
BTW, I like the
a.x = 10 same as a["x"] = 10
feature in Lua; it seems to be the same as in Ruby (objects/structs are just equal to dictionaries/associative arrays). It is really a kind of Esperanto attempt.
The problem is that while everything depends on dictionaries these days, they are not so easy to implement (efficiently). You already need a substantial runtime for them.
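For comparison, Python can emulate Lua's table unification in a few lines (my sketch; `Table` is an invented name), which also shows how much runtime machinery this small convenience quietly assumes:

```python
# Sketch: one dictionary backs both attribute and item access,
# mimicking Lua's a.x being the same as a["x"].

class Table(dict):
    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            raise AttributeError(name)

    def __setattr__(self, name, value):
        self[name] = value

a = Table()
a.x = 10
print(a["x"])   # 10
a["y"] = 20
print(a.y)      # 20
```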
My idea of a "multi-program" BASIC, some kind of "Tiny Plan 9", would actually be orders of magnitude easier to implement than useful associative arrays (dictionaries). And I feel like the permacomputing crowd appreciates being able to understand the inner workings of the system/runtime/interpreter, too...
The first language to make dictionaries popular was Perl, and that was not until the 1990s. I am not sure about the 1980s, but while the performance of PC compatibles would already have allowed associative arrays, the Pascal textbooks were still teaching linked lists... of course, now I am getting to my favorite topic of my "balanced binary skip lists" (TM) replacing hashes to implement dictionaries... but I still have 386-level CPUs in mind when thinking about this... I would not dare to try it on 8-bits...
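A simpler hash-free stand-in (not the "balanced binary skip lists" mentioned above, which stay unspecified here): a dictionary as a sorted array searched by bisection - O(log n) lookups, predictable memory, and inner workings you can actually read.

```python
# Sketch: a dictionary as two parallel sorted arrays plus binary
# search. No hash function, O(log n) lookup; inserts pay an O(n)
# shift - the realloc-style relocation cost discussed in this thread.

import bisect

class SortedDict:
    def __init__(self):
        self.keys = []
        self.vals = []

    def __setitem__(self, key, value):
        i = bisect.bisect_left(self.keys, key)
        if i < len(self.keys) and self.keys[i] == key:
            self.vals[i] = value
        else:
            self.keys.insert(i, key)
            self.vals.insert(i, value)

    def __getitem__(self, key):
        i = bisect.bisect_left(self.keys, key)
        if i < len(self.keys) and self.keys[i] == key:
            return self.vals[i]
        raise KeyError(key)

d = SortedDict()
d["banana"] = 2
d["apple"] = 1
d["apple"] = 10                  # overwrite works too
print(d["apple"], d["banana"])   # 10 2
```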
Anyway, I feel that it is now really time to release some code and not just talk :-)
@pavel @dirtycommo @akkartik @cian @permacomputer
My wish list for an alternative-history BASIC for alternative-history 8-bit home computers, which should actually be doable:
1) variable-size arrays (like DIM A without a size, or REDIM A, N or so)
2) appending one item to a variable-size array: APPEND A,B or GROW A or so
3) LEN() or MAX() for arrays, so you can FOR I = 0 TO MAX(A())
4) directly initializing an array from a constant, like DATA A() = 1,2,3 (instead of using READ in a loop, wtf...)
All of this would require being able to reference an entire array, not just one element of it - A was not A$... but I am not sure if A was the same as A()?
Even the DATA ~ READ framework (which I really don't like) would benefit from being able to grow arrays dynamically, but of course... this is not easy to do even in C :-) realloc() on simple CPUs means also relocating the data in memory. So again... linked lists for everything?
Having 4) implemented would somewhat ease the need for 1) and 2), but 3) would be easy and cool anyway (and could be done without dynamic sizing). It would save source code and memory, because the interpreter knows this value anyway.
Well... dynamically sizing stuff is the hard part. Knowing in advance how much of anything you will have is easy :-)
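The four wishes map onto one small mechanism: a growable array that doubles its capacity realloc-style. A Python sketch to make the semantics concrete (names invented; a Python list is used only as raw storage):

```python
# Sketch: wish-list items 1-4 modelled on a capacity-doubling array.
# On a real CPU the doubling step is realloc(): copy to a bigger block.

class Arr:
    def __init__(self, items=()):        # 4) DIM A() = 1,2,3
        self.n = len(items)
        self.cap = max(4, self.n)
        self.data = list(items) + [0] * (self.cap - self.n)

    def append(self, x):                 # 2) APPEND A,x
        if self.n == self.cap:           # full: relocate into 2x the space
            self.cap *= 2
            self.data = self.data + [0] * self.n
        self.data[self.n] = x
        self.n += 1

    def __len__(self):                   # 3) LEN(A) - interpreter knows it
        return self.n

a = Arr([1, 2, 3])                       # 1) size not fixed at DIM time
a.append(4)
a.append(5)                              # triggers the realloc-style grow
print(len(a), a.data[:len(a)])           # 5 [1, 2, 3, 4, 5]
```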
I would still have a problem with indexing arrays with () instead of []... that would be as painful as = instead of ==.
So no... I won't ever return to BASIC... it would have to be at least some TinyLua or TinyPython... very simplified, maybe without dictionaries, but at least with [ ] and == :-)
@pavel yes, I see, but you are not getting it.
Of course, EVERYTHING is possible these days, even cheaply. But that is not the point. There are Bluetooth sharing apps on cellphones, and so on.
But this whole ecosystem is going to be dead in a few years. Maybe even banned. The story of removable magnetic media of all types is clear: the ecosystem evolves way too fast and nothing is "permanent". It is not very likely that SD and microSD cards will be around forever, when all the previous media disappeared.
I think the idea of #permacomputing is something that has haunted me since, e.g., the 1990s.
Most of us threw away our 8-bits, thinking that PC compatibles would be the standard to stay. And after 2000 it seemed that while the format changed from desktops to notebooks, at least they were still PC compatibles, and that Linux would save us. But floppies and CD-ROMs followed the fate of cassette tapes. Now people upload everything vaguely anywhere... but nothing new really boots into anything (or at least it feels so, and with AI chips, there won't even be such a thing as "booting"...)
Going back to the stone age is a radical idea, but it is obvious that there will be a practical need for an air-gapped general-purpose computing platform. Reliable, not dependent on software updates...
@xChaos @pavel @dirtycommo @permacomputer
Paper tape is fairly robust and easy to manufacture.
@xChaos @pavel @dirtycommo @permacomputer
For 8-bit machines, the only language (other than assembler) that ever made sense to me was Forth.
@pavel I would like some compromise. Like a null-modem link between identical devices. So owners would be likely to exchange data willingly.
Short algorithms can be handwritten or published, why not. But not all programs fit on a single screen.
I admit that the idea of "acoustic" or "unplugged" programming is fascinating ;-) But both BASIC interpreters for the Sharp MZ-800 had to be loaded from cassette, and there was not much code published for this machine, so I usually loaded software from cassette tapes and saved my own to cassette tapes too.
The situation was perhaps different with machines which booted from ROM directly into BASIC...
@pavel @dirtycommo @xChaos @permacomputer
But technically very simple. There's no conceivable future where we have computers, but do not have paper cards. There used to be simple machines that you could punch cards with (my secondary school had an old one kicking around in the 80s).
@pavel @dirtycommo @xChaos @permacomputer
C might be OK - though there probably wouldn't be much in the way of optimizations. I wonder if Zig would work (assuming 1.0 ever comes out).
And you could probably get a LISP working - though obviously not Common Lisp.
@cian @pavel @dirtycommo @xChaos
https://github.com/dhansel/PaperTapeReader
"This is a DIY reader for 8-bit (9-hole) vintage paper tapes. I started working on this because I couldn't find any DIY solution online that would allow for fully automated tape playback".