Discussion and questions related to the course Motorsport Wiring Fundamentals
Hi, I've been trying to wire and calibrate my oil temp sensor and I don't know what I'm doing wrong.
Ecumaster oil temperature sensor
330 Ω resistor between the 5 V and Analog 3 wires
The datasheet shows that the voltage should decrease as the temperature goes up.
Mine does the opposite.
I have the ECUMaster oil temperature sensor selected in the calibration menu, pull-up enabled on the analog input, and the 330 Ω resistor wired between the 5 V and Analog 3 wires.
What am I doing wrong?
I am pretty sure the information you are using from the PDF file is for the EMU Classic or the even older EMU. The Black has built-in pull-up/down resistors on all of its analog inputs, and they are software selectable. Remove the resistor you wired in and use the software to apply the pull-up resistor needed for the sensor. Based on your explanation, it sounds like you have a pull-up resistor physically wired into the harness in addition to the selectable pull-up/down in the software.
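For anyone following along: the pull-up and the NTC sensor form a simple voltage divider, which is why the voltage should fall as temperature rises, and why a second, external pull-up skews the readings. A minimal sketch of the divider math (the NTC resistance values are assumed for illustration, not taken from the ECUMaster datasheet):

```python
# Voltage divider formed by a pull-up resistor and an NTC sensor:
#   V_out = V_supply * R_sensor / (R_pullup + R_sensor)
# NTC resistance falls as temperature rises, so V_out falls too.

def divider_voltage(v_supply, r_pullup, r_sensor):
    """Voltage seen at the analog input for a given sensor resistance."""
    return v_supply * r_sensor / (r_pullup + r_sensor)

# Illustrative NTC values: high resistance when cold, low when hot.
r_cold = 2500.0   # ohms, roughly "cold" (assumed value)
r_hot = 50.0      # ohms, roughly "hot" (assumed value)

v_cold = divider_voltage(5.0, 330.0, r_cold)
v_hot = divider_voltage(5.0, 330.0, r_hot)
assert v_cold > v_hot  # voltage drops as temperature rises
```

If you stack a second pull-up in parallel, the effective pull-up resistance drops and every reading shifts, so the wired-in resistor has to go.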
Well, I had it wired incorrectly.
Now I have it wired this way:
- one wire from the sensor to Analog 3
- the other wire from the sensor to sensor ground
No resistors anywhere.
These are my parameters. Now I have some readings, but my sensor reads 1.5 V at 20 °C and maxes out at 0.25 V at 130 °C.
It's usable, but I want more resolution. Does anyone know what I'm doing wrong? The sensor datasheet says 0.0 V = 200 °C and 4.7 V = 0 °C, almost 5 V of resolution.
According to the auto-populated calibration table for oil temperature using the ECUMaster Oil Temp selection, 0 V = 200 °C and 3.3 V = 0 °C, so about 3.3 V of span is all you should expect with the internal pull-up.
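The smaller span follows from the same divider math: the value of the pull-up relative to the sensor's resistance range sets how much of the 0-5 V window you actually use. A quick sketch comparing two pull-up values (all resistances are assumed for illustration; the EMU Black's internal pull-up value is not stated here):

```python
# How the pull-up choice affects the usable voltage span of an NTC
# divider. Resistance values below are illustrative assumptions.

def divider_voltage(v_supply, r_pullup, r_sensor):
    return v_supply * r_sensor / (r_pullup + r_sensor)

r_cold, r_hot = 2500.0, 20.0  # assumed NTC resistances, cold vs hot

for r_pullup in (330.0, 2200.0):
    v_cold = divider_voltage(5.0, r_pullup, r_cold)
    v_hot = divider_voltage(5.0, r_pullup, r_hot)
    print(f"pull-up {r_pullup:.0f} ohm -> "
          f"{v_cold:.2f} V cold, {v_hot:.2f} V hot, "
          f"span {v_cold - v_hot:.2f} V")
```

So the resolution you see is fixed by whatever pull-up the ECU applies internally; the calibration table just has to match it, which is why the auto-populated table tops out at 3.3 V rather than the datasheet's 4.7 V.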