On Fri, 2024-11-01 at 22:28 -0700, Dave Close wrote:
> The difficulty comes in defining "what {it} is supposed to do". If I buy an IC measuring less than 0.2 cm², I get (or can get) hundreds of pages of documentation on what it does and does not do, exactly how it does those things, and test results. If I buy a piece of software as complex as a database, I get something called "getting started" and that's it. How can you prove they didn't do what they promised when they never really promised anything?
Well, databases used to (and still should) have a manual defining their functions and the required syntax for using them. It's not the kind of software where you can just poke around and see what it does. Of course it's up to you to construct sensible logic for how to work your data. That's the curly bit.
It's hardly the software's fault if you upset things by having a function that divides one data entry by another, and you never checked for and disallowed a zero entry. But if an input of zero causes it to have a giant meltdown and go on a rampage trashing all saved data, rather than simply failing that one function, I would blame it for that.
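In code terms, the sane behaviour is roughly this (a minimal Python sketch; `safe_divide` is a made-up name, not from any particular database): reject the bad divisor and fail just that one operation, leaving everything else intact.

```python
def safe_divide(numerator, denominator):
    """Divide one data entry by another, failing only this
    operation on a zero divisor instead of trashing everything."""
    if denominator == 0:
        return None  # refuse the entry; the caller decides what to do next
    return numerator / denominator

print(safe_divide(10, 2))  # 5.0
print(safe_divide(10, 0))  # None -- the rest of the data is untouched
```

The point is the containment: one bad input fails one function call, and the saved data is never at risk.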
And you don't expect a word processor to bomb out simply because you tried to make a highlighted word bold. But we've probably all seen computer crashes just as stupid as that. I learnt to CTRL-S save as I go along, just about every paragraph, thanks to things like that.