I have to agree with the other poster who said that serial comm standards just have too many choices. Of course, that is easy to say until you need one of those oddball parameter configurations to get the job done.
It starts with the 9-pin DTE/DCE issue and manufacturers that don't standardize on one connector gender for DTE and the other for DCE - how hard would that be?? Then we have industrial equipment manufacturers who, for whatever perverse reason, use RS-232 with 9-pin connectors to program their devices but shuffle the pins into some very nonstandard arrangement (presumably so you have to buy their overpriced programming cable). As far as I am concerned, if it talks RS-232 I should be able to connect a regular 9-wire serial extension cable (male on one end, female on the other) between my PC and the device and talk to it (once I sort out baud rate, parity, and stop bits, of course) - how difficult would that be?? But I guess if things were done logically it would cut into the market for RS-232 line status indicators, breakout boxes, and special cables.
Ok, the rest of the story... in this instance everyone was wrong and everyone was right.
The ASCII standard (not extended ASCII) is 7 bits plus parity. The 7 data bits plus the parity bit give you an 8-bit data word.
Newport, the manufacturer of the meters and a pretty good manufacturer in my past experience, says their meter supports ASCII communications, but they still permit the selection of 8,ODD,1 in the comm parameter setup. True to the ASCII spec, they do not generate the parity bit if you set 8 data bits. But they still transmit 9 bits before the stop bit.
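A quick sketch of why 7-data-bit-plus-parity and 8-data-bit-no-parity setups can accidentally interoperate: a UART sends the LSB first, so a 7-bit ASCII code plus its parity bit occupies exactly the same 8 bit-slots as an 8-bit data word, with the parity landing where the receiver expects bit 7. The function names below are my own illustration, not any real driver API.

```python
def odd_parity(bits7: int) -> int:
    """Return the odd-parity bit for a 7-bit value.

    Odd parity means the total count of 1s (data + parity) is odd,
    so the parity bit is 1 when the data already has an even count.
    """
    ones = bin(bits7 & 0x7F).count("1")
    return 0 if ones % 2 == 1 else 1

def frame_7o1_as_8n1(char: str) -> int:
    """Byte seen by a receiver configured for 8 data bits, no parity,
    when the sender is configured for 7 data bits, odd parity.
    The parity bit lands in bit 7 of the received byte."""
    code = ord(char) & 0x7F
    return code | (odd_parity(code) << 7)

# 'A' = 0x41 has two 1-bits, so odd parity sets the MSB:
print(hex(frame_7o1_as_8n1("A")))  # 0xc1
# 'C' = 0x43 has three 1-bits, so the parity bit is 0:
print(hex(frame_7o1_as_8n1("C")))  # 0x43
```

So an 8-N-1 receiver still frames every character correctly (both setups are 10 bits per frame including start and stop); it just sees a "dirty" top bit on some bytes, which is exactly the kind of intermittent-looking failure described here.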
My customer's end user specified the meters and the comm parameters to match some convention in their plant. It is hard to believe that they have not encountered this issue before if they really use this equipment with this set of parameters already.
The PLC will happily spit out 6-9 data bits, any parity, and one or two stop bits, and does not care what data is coming out the port - ASCII, binary, or whatever. Same for the RS-232-to-485 converter.
Shame on me for not paying more attention to the ASCII spec instead of getting bogged down in RS-485, which had checked out correctly early on. If I or any of the tech support folks I consulted had noticed the ASCII issue, the problem would have been solved sooner. As it was, a late-night error in setting parameters let me discover that it would work with different settings on the two pieces of equipment, and only then did I research further and find the ASCII standard's impact on the problem.
I'm glad we control/automation types now have some newer, more intelligent networks that avoid a lot of these problems. Still, it seems to me that getting the comms working is always the biggest challenge with these networks.
There were some challenges with RS-232, later EIA-232, but mostly because of a lack of adequate standardization. If only one set of conditions had been standardized on, it would have worked for almost every application. But developers didn't, and the result was a chance for USB to proliferate, with its fragile connectors that are neither field serviceable nor latching. One of the goals with USB was to make the prior connections obsolete so that more equipment could be sold. That did happen.
But the real disadvantage, ignored by most, is that unlike other communications protocols, USB demands that the host processor stay awake, which wastes power. Other schemes can use a peripheral device and let the CPU stay in a sleep mode, saving power, but not USB.
As for the problem with the meters, resetting to the default mode could have worked as a starting point to discover the parity issue. Sometimes it helps to return to what somebody else made work, then move on from that point.
I can remember some 30 years back, just out of school. We were developing firmware on a Z80-based Apple II computer and then downloading it via RS-232 to a Data I/O PROM programmer. Each download took over an hour. I knew it should not take that long, so I used part of my lunch hour for about three months before I got the handshaking correct. Wow, a download in less than two minutes.
I am not sure USB is much better. It seems I am always searching for a device driver for some odd hardware I end up working with.
I am happy to avoid RS-232 whenever I can. RS-485 is only a little better.
There are far too many parameters: baud rate, parity, stop bits, and data bits alone allow somewhere around 9x3x2x2 = 108 software combinations. Multiply this by the 4 to 6 hardware pinout variations and you have a set of around 500 combinations of things that need to be set correctly.
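That arithmetic can be sketched directly; the specific baud rates listed below are just typical values I've assumed for illustration:

```python
# Rough count of RS-232 setup combinations (illustrative figures only).
baud_rates = [300, 1200, 2400, 4800, 9600, 19200, 38400, 57600, 115200]  # 9 common rates
parities   = ["none", "even", "odd"]   # 3 choices
stop_bits  = [1, 2]                    # 2 choices
data_bits  = [7, 8]                    # 2 common choices

software_combos = (len(baud_rates) * len(parities)
                   * len(stop_bits) * len(data_bits))
print(software_combos)        # 108

# With roughly 4 to 6 hardware pinout variants on top of that:
print(software_combos * 5)    # 540, i.e. "around 500"
```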
Add the further complications of marginal cabling and hardware, and the initial setup time can be extensive. Further, some combinations will appear to work but, if not perfect, will suffer from intermittent faults that are not always readily apparent.
Next come the programming details, with some drivers providing decent support and others poor support for basic functions like clearing buffers and the like.
All in all, I am happy to use IEEE-488/GPIB, USB, and Ethernet in lieu of either RS-232 or RS-485 whenever possible. There are far fewer parameters that need to be just right with any of these other protocols.
Kim, we have to be very cautious when dealing with signals. Any interruption or disturbance in a signal can bring a tremendous change in the outputs. We experienced the same problem in our R&D lab, especially when the output of one block serves as the input for the next block. Most of the time, such debugging happens only at the last minute.
Wow - talk about deja vu! Now I'm going back about fifteen years. I had purchased a high-power current supply to drive a big coil to test some Hall effects. I was building a semi-automated system and programming it over IEEE-488. The current supply had a DIP switch that needed to be configured in order to communicate properly. I followed the manual and carefully set the DIP switch. There wasn't much else to do - I was hooking it to a system that had an IEEE card I was already using to talk to other equipment in the rack with no problems. The current supply worked fine in manual mode, but no matter what I tried, I could not get it to communicate with the IEEE card. Until I started messing with the DIP switch. I reversed one setting (only one, not all) to the opposite of the manual specs, and it started working. The company came out and bought us a steak dinner for pointing it out. I can only guess that most people were using it manually, which is why it had gone unnoticed...
Amazing that something so old-school as RS-232 can still be monkeyed with. (Ha! I ended a sentence with a preposition).
The newer Tektronix and Agilent scopes (and probably others) have built-in analyzers for most of the serial communication protocols. UART/RS-232 decoding usually comes for "free". Just probe the Tx/Rx anywhere in the chain (TTL or RS-232 levels) and the scope will quickly show the baud rate, parity (even/odd/none), number of stop bits, and number of data bits, and you can also measure the slew rate and the high/low idle levels - everything is there for the taking.
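Conceptually, what those scope decoders do with a captured waveform is not complicated. Here is a toy sketch, under my own assumptions (oversampled 0/1 line samples, standard 8N1 framing, line idling high), of pulling one byte out of a sampled UART trace; the function and the sample format are illustrative only:

```python
def decode_8n1(samples, samples_per_bit):
    """Decode one 8N1 UART frame from a list of 0/1 line samples.

    The line idles high; a frame is: start bit (0), 8 data bits
    sent LSB-first, stop bit (1). Each data bit is sampled at its
    mid-point, the same trick a real UART receiver uses.
    """
    # Find the falling edge that marks the start bit.
    start = next(i for i in range(1, len(samples))
                 if samples[i - 1] == 1 and samples[i] == 0)
    byte = 0
    for bit in range(8):
        # Mid-point of data bit N is (N + 1) bit-times past the
        # start edge, plus half a bit-time.
        pos = start + samples_per_bit // 2 + (bit + 1) * samples_per_bit
        byte |= samples[pos] << bit   # LSB arrives first
    return byte

# Build a fake trace for the byte 0x4B at 4 samples per bit:
bits = [(0x4B >> i) & 1 for i in range(8)]
samples = [1] * 4 + [0] * 4 + sum([[b] * 4 for b in bits], []) + [1] * 4
print(hex(decode_8n1(samples, 4)))  # 0x4b
```

A real scope additionally has to estimate the baud rate (from the shortest pulse width) and try the parity options, but the bit-slicing above is the core of it.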
I like running Design Verification on RS-232/UART because it is so well established.
RS-232/RS-422 is old school because it works so well on so many levels. The UART engine inside ICs is easy to design and use, the electrical signals are easy to create, and there is a whole ecosystem around RS-232. Just look at the back of most large electronics - TVs, cable boxes, TiVo, EchoStar, Dish, audio gear - they all have a "Communication Port", which I bet dollars to donuts is RS-232/UART.