Posted From: 184.108.40.206
|Posted on Sunday, 21 December, 2014 - 01:07 pm: |
This is mainly about the SY regulator up to 1976, with the high, medium and low terminals.
The regulator works by adjusting the voltage going to the field windings: more volts means more output.
For the regulator to do this a reference voltage is needed, which comes from the battery via the ammeter shunt.
The low terminal on the regulator is connected direct, the medium terminal has one resistor in series and the high terminal two. The resistors are internal to the regulator, and they are shown in the theoretical wiring diagram in the workshop manual.
What happens is that these resistors lower the reference voltage. This makes the regulator think that the voltage is lower than it actually is, so the output of the alternator increases.
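To put numbers on the principle, here is a minimal sketch. The 13.5v set point and the 5mA sense current are made-up illustrative figures, not measurements from an SY regulator: the idea is just that the regulator holds the voltage it *senses* at a fixed set point, so any drop across resistance in the sense lead gets added on top of the real system voltage.

```python
# Illustrative sketch only: the regulator holds its sensed voltage at a
# fixed set point, so a resistor in the sense lead raises the real system
# voltage by the drop across it. Both figures below are assumptions.

SET_POINT = 13.5       # volts the regulator tries to hold at its sense input
SENSE_CURRENT = 0.005  # amps drawn by the sense terminal (assumed)

def charging_voltage(series_ohms):
    """System voltage once the regulator sees SET_POINT after the drop."""
    return SET_POINT + SENSE_CURRENT * series_ohms

for ohms in (0, 50, 100, 200):
    print(f"{ohms:>4} ohm in the sense lead -> {charging_voltage(ohms):.2f} v at the battery")
```

A bigger series resistance means a bigger drop, which means a higher charging voltage, exactly the effect the internal resistors produce on the medium and high terminals.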
When this lot was designed, 13.5 volts was OK for older type batteries, which had plain lead plates. But from about 1990 car batteries have been lead antimony or lead calcium, and now lead calcium for both negative and positive plates.
For modern batteries 14.5 is about right. Fords are now 14.8, and so, I think, is the Optima spiral wound. Ford and Optima use silver as well.
If required, a resistor in series with the feed to the high terminal (the reference voltage) will further raise the output voltage. If this resistor is shorted out with a switch then the output voltage goes back down.
The field winding draws about 4 amps. The ohms needed depend on how the regulator reacts to the voltage change.
I reckon bring the reference volts down by a quarter of a volt at a time. I did a Lucas 35 amp ACR series alternator for a Morris Minor recently: it started at 13.7 at 2000 rpm, and resetting it to 14.6 took minus 0.6v of reference voltage compensation. Plus that regulator is internal. We used a piece of wire from a heater element and hid it under the alternator: we connected a long length and moved the contact down the wire until we got 14.6. A bit of suck it and see. At 700 rpm, 14.1; at 1200 rpm, 14.6. At 30 mph in top it's 1500 rpm. Just right, and it never exceeds 14.6 regardless of rpm.
A good clue is to measure the resistance between the low, medium and high terminals. The ohms will give a starting point.
It is possible to overload the alternator if, for instance, the reference voltage is lowered so much that the alternator sits at 16v: the potential difference between 16v and a battery at 12.6v will cause the amps to exceed the alternator rating. The amount of voltage compensation needed is small.
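As a crude Ohm's-law sketch of that overload risk: the charging current is roughly the voltage difference divided by the battery's internal resistance. The 0.06 ohm internal resistance and the 35 amp rating below are illustrative assumptions (a real battery also limits the current as its terminal voltage rises), but they show how quickly a couple of extra volts runs past the alternator's rating.

```python
# Crude Ohm's-law estimate: charging current is roughly the difference
# between alternator voltage and battery voltage, divided by the battery's
# internal resistance. 0.06 ohm and the 35 A rating are assumptions.

def charge_current(alternator_v, battery_v, internal_ohms=0.06):
    return (alternator_v - battery_v) / internal_ohms

RATING = 35  # amps, like the Lucas ACR alternator mentioned above
for v in (14.6, 16.0):
    amps = charge_current(v, 12.6)
    flag = " (over the rating!)" if amps > RATING else ""
    print(f"{v} v -> {amps:.0f} A{flag}")
```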
Excess amps cause overheating and it will blow the diodes.
Higher voltage will not damage the insulation; one standard insulation test is done at 110v.
Car diodes are rated to well over 36v. Peak inverse voltage is high as well.
With a charged battery and no loads apart from ignition, the ammeter should show a reading of say 3 amps, rising to 6 amps when revved, and the voltage should be 14.6. If the amps are higher, and the battery is definitely charged at 12.6 to 12.7 open circuit after standing for one hour, then the charging voltage is too high for that particular battery.
Or don't do anything, but I suggest that once a month, regardless of mileage, 24 hours on a mains charger at 5 amps.
The reason is that because older cars charge at a lower voltage, the modern battery never gets fully charged, which greatly shortens its life. A slight overcharge helps prolong battery life.
It takes 72 hours at 13.8 volts to fully charge a battery, so 13.5 is nowhere near and the battery will never get there.
And given the low miles most of us do, even if the voltage is reset, periodic mains charging will still be necessary.
Just pointing out all the wrinkles.
It's nice to have the option of adjusting the charge voltage to suit should one have a below par voltage.
Note this is not a cure for a dodgy regulator. The danger is that an unstable regulator may suddenly decide to work properly, causing overcharging due to the over-compensated reference voltage. So keep an eye on the ammeter: if it starts flicking to high amps it means an unstable regulator.
Also note that the regulator and alternator don't know that there is a battery, just a load. The regulator is sensing the output of the alternator, not the battery. If the lights are turned on and they draw 12 amps, this will cause a voltage drop and the alternator will increase voltage by supplying an extra six amps, which is instantaneous. The regulator reference voltage comes from the ammeter shunt.
I am a mechanic, not an electrical engineer. So if anybody reading this knows how to design a variable circuit then please design away and report back, because a small knob to turn would be good.
The low terminal gets the full reference voltage, the middle less and the high even less.
So: a circuit that drops the voltage from, say, 12v input to an adjustable output, with a low current draw. As the input voltage goes up, so must the compensated output; the circuit is not regulated. Once set, the voltage drop must be the same regardless of input voltage, e.g. 12.6v in, 12v out; 13.6v in, 13v out; thus a constant 0.6v drop. Resolution 0.01v. Also a timer circuit that makes the alternator charge at 14.8 for the first 5 minutes then drops back would be good, and pulse charging followed by no charge would be handy. Size-wise, matchbox size.
I don't know how this is done.
(Message approved by david_gore)
Posted From: 220.127.116.11
|Posted on Monday, 22 December, 2014 - 04:47 am: |
The resistance of the field windings is about 4 ohms; this has no influence on the reference voltage compensation.
If the lights are turned on, a 12 amp load, then the alternator puts out an extra 12 amps, not six. An obvious stupid mistake, what am I like.
(Message approved by david_gore)
Posted From: 18.104.22.168
|Posted on Tuesday, 23 December, 2014 - 12:05 pm: |
I have worked out how to make the device to offset the reference voltage.
The circuit is a potential divider or a voltage divider.
Two resistors in series. One end goes to earth or negative, the other end goes to the wire that went to high on the regulator.
The H terminal then connects between the two resistors.
This works on the ratio of the ohms of the resistors.
To lower the volts by 0.5v, the ratio of 0.5 to 12.5 is about 25 to 1.
If the first resistor is 1 ohm and the second 25 ohms, then the voltage between the middle and earth will be about 12.0 volts, and the voltage between the middle and the live side will be about 0.5 volts.
However, this also means there is a resistance of 26 ohms between live and earth, which will consume about 0.5 amps.
To solve this multiply the ohms by a thousand.
So the first resistor is 1000 ohms and the second 25,000 ohms. The drain is then a thousand times less: 0.0005 amps, or half a milliamp.
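The divider sums above can be checked in a couple of lines of code. This is just the arithmetic from the post worked through, nothing circuit-specific beyond the values already quoted:

```python
# Voltage divider: r_top sits between live and the H terminal, r_bottom
# between H and earth. Returns the H-terminal voltage and the drain current
# flowing through the pair from live to earth.

def divider(v_live, r_top, r_bottom):
    v_h = v_live * r_bottom / (r_top + r_bottom)
    drain = v_live / (r_top + r_bottom)
    return v_h, drain

print(divider(12.5, 1, 25))        # about 12.0 v, but about 0.48 A drain
print(divider(12.5, 1000, 25000))  # same ratio, a thousand times less drain
```

Multiplying both resistors by the same factor leaves the output voltage untouched and scales only the drain, which is why the jump from 1/25 ohms to 1k/25k costs nothing in accuracy.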
To make it adjustable, given the inaccuracies of resistor values, the first resistor becomes a potentiometer. Because the 1000 ohm figure assumes the parts are accurate, the pot should be 1.5k ohms.
The live side goes to one end, the other end goes to the 25k ohm resistor, and the H terminal is fitted to the moving middle terminal of the pot.
How this works: imagine a length of bare resistance wire connected between live and earth, and a second wire touching it. As the second wire is moved up and down the first, the voltage on the second wire changes: nearer to earth is lower, nearer to live is higher.
The values quoted need checking by an electronics guy, because I think the pot value may be too low and the range may be too narrow. My brain got confused on the likely 10% error in resistors.
(Message approved by david_gore)
Posted From: 22.214.171.124
|Posted on Wednesday, 24 December, 2014 - 10:19 am: |
I have checked the values, which are correct, but they are rare rather than common values, plus resistors can be 25% tolerance.
So a quick redesign: the new values are two potentiometers, 5k and 200k, instead of the one pot.
These are common and about 75p each.
The 200k is a trimpot which is set once only, to set the maximum alternator output of 14.8 volts. The idea is that the 5k pot is set to its maximum position, fully on the stop, then the trimpot is adjusted to give 14.8v from the alternator. Then, when the 5k pot is turned down, the alternator output drops back to where the standard regulator regulates, which on my car is 14.15 volts at 1500 rpm. In other words the device is switched off.
The trimpot is wired as a variable resistor: its moving contact goes to one end of the 5k pot, one fixed end of the trimpot goes to earth, and the other fixed end is not connected. The 5k pot is a voltage divider: its other fixed terminal goes to live and its moving contact goes to high on the regulator.
The current in this circuit is very low: the high terminal on the regulator is connected to transistor gates which turn the transistors on and off, so with the high impedance of about 100k (the 200k trimpot will be set about half way) the current consumption is only around a tenth of a milliamp. This circuit should work on any external regulator if one has a less than optimum charging rate for modern batteries. Trouble is, that still doesn't fix the low-miles use.
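Here are the two-pot sums as a sketch, assuming a 12.5v live feed and the trimpot set to 100k (about half its travel), and ignoring the tiny current drawn by the H terminal itself; the assumed values are only the ones quoted above:

```python
# The 5k pot is a divider: one end to live, the other end to earth through
# the 200k trimpot (wired as a variable resistor), wiper to the H terminal.
# wiper_fraction = 0 at the live end of the pot, 1 at the trimpot end.

def h_voltage(v_live, wiper_fraction, trim_ohms, pot_ohms=5000):
    r_below_wiper = (1 - wiper_fraction) * pot_ohms + trim_ohms
    return v_live * r_below_wiper / (pot_ohms + trim_ohms)

V_LIVE, TRIM = 12.5, 100_000  # assumed feed voltage and trimpot setting
print(h_voltage(V_LIVE, 0, TRIM))  # pot turned down: full reference voltage
print(h_voltage(V_LIVE, 1, TRIM))  # pot on the stop: about 0.6 v lower
print(V_LIVE / (5000 + TRIM))      # drain: roughly a tenth of a milliamp
```

With the pot turned down the H terminal sees the full live voltage, so the regulator behaves as standard; on the stop the reference is about 0.6v low, which matches the sort of compensation found useful on the Morris Minor above.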
I shall build it and see what happens.
Note the reference voltage is taken from the ammeter shunt because the regulators are made with non-adjustable settings, which makes the source of the reference voltage important. However, because the device is adjustable, the voltage can come from any convenient source, such as an ignition-only live on the fuse board, which would make the consumption of the device nil with the engine off. But because a wire still needs to go to the regulator, one might as well run two. The device can be fitted next to the regulator; its size is 1x1", depending on how big the knob for the 5k user-adjust pot is. Maplins do these bits and they have 10mm dia knobs, so the device could be fitted inside the car. Having a burst of 14.8v when doing lots of short journeys could be handy.
It's small and could be fitted to the fuse box.
How about, instead of a knob, a thumb wheel that sticks out of the gap between the fuse box and the trim? Have it stick out say 6mm, then just adjust with a thumb.
Black with thumb slots.
Note this circuit must be earthed, otherwise it will be unstable.
(Message approved by david_gore)