That’s certainly the assumption behind the research.
A false theory, if that’s what this is, doesn’t necessarily mean there won’t be any useful results; no theory is perfect, and imperfect theories can still yield useful results.
The problem, as I see it, is that when you have a reductive theory of mind, you end up paying attention only to the things your theory explains, while everything else is silently ignored. Then even partial success is taken as confirmation of the theory, and failures are merely signs that more development is needed.
That kind of thing may happen, but I think there are much more realistic and insidious possibilities.
The first is this: if your mind (or brain, or whatever) is hooked up to a computer, everyone will know what is going on inside your head. Forget any assumptions about privacy. As soon as you do this, the working assumption must be that any entity, whether government or commercial, can and probably will have access to whatever these things measure. Of course, what they are actually measuring is electrical impulses in the brain, but fine, let’s call them “thoughts” for the sake of the argument.
The next step, and it will not be long in coming, is that they will monitor your thoughts and try to change them. Wellness apps will advise you to calm down when you’re angry. Ads will appear when you think of pizza. Governments will detect dissident thoughts and suppress them, or imprison you.
Did you notice the bit where they said “read and write”? Yeah. Writing thoughts into your brain. I’m sure governments won’t do anything creepy with that.
I like my brain, and I’ll keep it my own, thanks.