For example, UX is all about the "little things," and most UI these days fails big time in that regard. So how can an AI learn what is good if most of everything out there is bad?
And if the answer is that it just heavily weights Don Norman and Strunk & White, that only proves humans are lazy, short-term-focused jerks. So, like, we had to get to the singularity before we got around to listening to and implementing the things that have been said all along, but that no one listens to?