The dominant approach at the time was Expert Systems. These used a lot of carefully crafted data and manually curated facts that the inference engine could use. They also fit in a MUCH smaller footprint compared to conventional neural networks. But you also don't get real language processing, reasoning beyond the target problem domain, and stuff like that - it's laser focused and built on very small amounts of data. Much of the research from back then centered on using Lisp and Prolog of all things, so BASIC isn't a big stretch.
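The core idea really is just a hand-written rule base plus an inference loop over it. A toy forward-chaining sketch in Python (the facts and rules here are invented for illustration, not from any real system):

```python
# Minimal sketch of the "curated facts + inference engine" idea behind
# expert systems. Facts and rules are made up for illustration; a real
# system would carry hundreds of hand-written rules for its one domain.

facts = {"has_fever", "has_cough"}           # manually curated facts
rules = [                                    # hand-crafted if/then rules
    ({"has_fever", "has_cough"}, "possible_flu"),
    ({"possible_flu"}, "recommend_rest"),
]

# Forward chaining: keep firing rules whose conditions are all satisfied
# until no new conclusions can be derived.
changed = True
while changed:
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(facts)  # {'has_fever', 'has_cough', 'possible_flu', 'recommend_rest'}
```

Everything the system "knows" is right there in the rule list, which is why the footprint is tiny and why it falls over the moment you step outside the domain it was written for.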
Prolog is even better suited for such applications.
who tf even uses prolog anymore (said the one still using old basic, from when it still had line numbers and everything was goto all the way down)
this is very clearly a self-deprecating joke btw