Perhaps I was missing something in VB6. Life seemed so much easier with the serial port control. Granted, MSCOMM has some serious limitations (its hard cap at COM16, for one), but back then, sending any character from ASCII 1 to 255 was a non-issue. If you sent &HA0 out the serial port, you got &HA0 out the other end. More to the point, if your hardware was streaming binary data into your serial port, it was simple enough to decode. In .NET, that changed.
In my GCS project, my intention is to be able to handle many different types of data input, both ASCII and binary, so I had to make a decision on how to handle the data stream. Should I import it as a byte array (encoding is not an issue if you receive data this way) or as a string, so I can use SubString (or Mid, Left and Right) functions on the data? Since there's no easy way to do an InStr function (which searches a string for an occurrence of another string) on a byte array, I opted for the string.
When dealing with human-readable data, there's no problem. Even without setting the SerialPort's .Encoding property, anything at ASCII 127 or lower passes through the port just fine. But as soon as you start streaming binary data that runs all the way up to 255, funky things start happening. So what's the deal? In .NET, the data coming out of the serial port is automatically decoded into a string, and .NET strings are Unicode. That means characters are no longer guaranteed to be a single byte; depending on the encoding, one character may be represented by two or more bytes.
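The split at ASCII 127 is easy to demonstrate outside of .NET. Here's a small Python sketch (my own illustration, not from the original post) showing why low bytes pass through any encoding unchanged while a lone high byte like &HA0 trips up a Unicode decoder. .NET's decoder fallback substitutes '?' (hex 3F); Python's `errors="replace"` uses U+FFFD, but the effect is the same: the original byte value is gone.

```python
# Bytes 0x00-0x7F mean the same thing in ASCII, UTF-8, and the Windows
# code pages, so plain text survives any of these decoders unchanged.
ascii_bytes = bytes([0x41, 0x7F])
assert ascii_bytes.decode("utf-8") == ascii_bytes.decode("latin-1")

# A lone high byte such as 0xA0 is NOT a valid UTF-8 sequence, so a
# UTF-8 decoder substitutes a replacement character instead of passing
# the byte through.
raw = bytes([0xA0])
print(repr(raw.decode("utf-8", errors="replace")))  # replacement char, not chr(0xA0)
print(repr(raw.decode("latin-1")))                  # one byte in, one char out
```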
But I don't need any of that. I just need one byte per character. Nothing fancy; 1 to 255 is just fine... but there's no option to turn this "feature" off. The SerialPort1.Encoding property can be set to all sorts of encodings. I knew I needed all 8 bits, so UTF7 was not an option. So I tried System.Text.Encoding.UTF8, System.Text.Encoding.GetEncoding(28591), System.Text.Encoding.GetEncoding(65001), System.Text.Encoding.GetEncoding(1251), and System.Text.Encoding.GetEncoding(1252). The best results initially came from GetEncoding(28591). I thought everything was working great... but then throw in Regional settings and everything gets wacky again.
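It's worth spelling out why code page 28591 (ISO-8859-1, a.k.a. Latin-1) gave the best results: it's the one encoding that maps every byte value 0 to 255 straight to the Unicode code point with the same number, so the round trip is lossless. A Python sketch of that property (my illustration, not from the post):

```python
all_bytes = bytes(range(256))

# ISO-8859-1 / code page 28591 / Latin-1: each byte 0-255 decodes to the
# code point with the same value, so nothing is lost either direction.
text = all_bytes.decode("latin-1")
assert text.encode("latin-1") == all_bytes
assert all(ord(ch) == b for ch, b in zip(text, all_bytes))

# UTF-8 (code page 65001) can't make that guarantee: bytes 0x80-0xFF on
# their own are invalid sequences and the decode fails outright.
try:
    all_bytes.decode("utf-8")
except UnicodeDecodeError as e:
    print("UTF-8 fails at byte offset", e.start)  # first failure at 0x80
```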
In XP, click Start, Control Panel, Regional and Language Options and change your Standards and formats setting from English (United States) to Polish and you'll see what I mean. It's the craziest thing. I created a sample project using a com0com loopback pair to pump ASCII 1 to 255 out one port from VB6, and I received it into my GCS on the paired port. 246 of the 255 characters came through just fine. Nine of them were goofy: either a 3F (the serial port's "unknown character" replacement) or something completely odd, like &H54 where &H98 should have been.
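The Regional settings dependence comes from the system's default ANSI code page, which is what Encoding.Default resolves to: Windows-1252 for English (US), Windows-1250 for Polish. Those code pages disagree on the high bytes, and 1252 also remaps the 0x80-0x9F range (for example, byte &H98 decodes to the small tilde U+02DC, not to code point &H98, and a few bytes there have no assignment at all). A Python sketch of both quirks (my illustration, using Python's cp1252/cp1250 codec names):

```python
# Windows-1252 remaps 0x80-0x9F to typographic characters...
assert bytes([0x98]).decode("cp1252") == "\u02dc"  # small tilde, not chr(0x98)

# ...and leaves a handful of bytes in that range entirely undefined.
try:
    bytes([0x81]).decode("cp1252")
except UnicodeDecodeError:
    print("0x81 has no assignment in cp1252")

# Switch Regional settings to Polish and the ANSI code page becomes 1250,
# which assigns many of the high bytes differently:
b = bytes([0xF1])
print(b.decode("cp1252"))  # n with tilde
print(b.decode("cp1250"))  # n with acute accent
assert b.decode("cp1252") != b.decode("cp1250")
```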
So the Google search began. How in the world do I fix this, or turn it off, or something? The first attempt was to use Chr on some ASCII values and ChrW on others, but that didn't fix all of the bad characters either. The data was arriving with the right hex values; the conversion from Byte to string was the problem. In the end, the solution was not to look at every byte. I got rid of the SerialPort's DataReceived event and instead fire a timer every 75 ms that checks for .BytesToRead > 0 and uses a built-in encoding function to get the right string from the serial port.
Dim nReadCount As Integer = serialPortIn.BytesToRead 'Number of bytes waiting
Dim cData(nReadCount - 1) As Byte
serialPortIn.Encoding = System.Text.Encoding.UTF8
'Read(Byte(), ...) pulls the raw bytes, untouched by the port's Encoding
Dim nReadResult As Integer = serialPortIn.Read(cData, 0, nReadCount)
'Decode the whole buffer in one call using the system's default ANSI code page
Dim sNewString As String = System.Text.Encoding.Default.GetString(cData)
sBuffer = sBuffer & sNewString
Originally, I had allowed the DataReceived event to fire after a single character arrived in the ReadBuffer and then evaluated the new character(s) one at a time to build my strings. Without the GetString call on the full buffer, I was unable to get the correct string data from the serial port.