Sunday, July 10, 2005
I Get It, I Think
We have been helping a client for several months to publicize a new technology. The technology is simple on the surface, and it works well. Moreover, the beta test is succeeding, even with the normal gear-up glitches.
Imagine my surprise, then, when I began to examine data from the test and started asking questions about it. For some reason, the data didn't make sense. Several of us in the office looked it over and couldn't begin to understand why it was the way it was or what it was actually saying. That's not a good sign because, although we are not engineers, we are not known to be ignorant either. The challenge facing us is that the data has to be easily understandable to reporters who, likewise, are not engineers but are not stupid. Late last week this sparked a series of questions and discussions that showed me I hadn't understood at all what the client has been trying to do. In fact, I am not sure the client fully understood either until we began raising questions that had no answers.
One thing we learned is that the client's instructions for reading the data could be misinterpreted -- and were. The instructions need rewriting. A second thing we learned is that the client, at this juncture, is not trying to achieve what we would consider normal statistical validity, and moreover sees no reason to do so. That still seems to me an open issue, but it also means we had not been qualifying the data in the right way when we talked about it. We'll fix that.
But, again, I am humbled by my own lack of understanding. All this time researching the technology, and I still didn't get what the client is doing. I do now, I think.