
Registered · 363 Posts
Why do you think the cylindrical cells used in the Tesla would be more robust than the prismatic cells? Does our BMS limit the low and high ends of the voltage range more than Tesla's does? Is the chemistry better in the Tesla, or does it have better thermal management at idle and while running?
Prismatic vs. cylindrical: I'm not claiming they are, it just wouldn't surprise me if they are more robust. A lot of cylindrical cells get made: there are thousands in each Tesla, and there are more Teslas out there than any other electric car, so it stands to reason that orders of magnitude more cylindrical cells are made than prismatic ones. Not to mention everyday things like flashlights, electric toothbrushes, etc. that use cylindrical cells. So manufacturers have a lot of practice making cylindrical cells, and the metal case gives an added layer of protection during handling.

The original Model S and X (at least the older ones) used NCA cells, which are actually more volatile than the NMC cells used in the Fiat. NCA has good energy density but is not as stable as NMC. Peak cell voltage on a Tesla is apparently 4.167 V at 100%; I'm not sure what that means relative to the NCA chemistry.

Perhaps the reason the data for the Tesla cells looks so much better comes down to the pack design. The 85 kWh pack is made up of 16 modules of 6S74P. Each parallel cell is connected to the bus bar with a thin wire that acts as a fuse, so if a cell shorts, its connection to the bus burns up and the pack capacity drops by about 1.3%. What I'm not sure about is how the BMS handles a partially failed cell. If it is smart enough to take it out of the circuit, the net effect would be the same: a 1.3% reduction in capacity.

The 500e has just 97 cells in series (none in parallel), so if one gets weak it drags the whole pack down, and if one fails, the entire pack fails. There is no redundancy in the Fiat pack. So it could be that the prismatic NMC cells are actually more robust and reliable than the cylindrical NCA cells in the Tesla, but the lack of redundancy lowers the reliability of the pack as a whole.
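To put rough numbers on that redundancy argument, here's a quick Python sketch (the pack layouts are the ones described above; treating a fused-out cell as simply shrinking its parallel group is my assumption, not confirmed Tesla behavior):

```python
# Rough arithmetic for one failed cell in each pack topology.
# Tesla 85 kWh (older S/X): 16 modules of 6S74P = 96 series groups,
# 74 cells in parallel per group. Fiat 500e: 97 cells in series,
# none in parallel. Assumes a fused-out cell simply shrinks its
# parallel group, which then limits the whole pack.

TESLA_PARALLEL = 74  # cells per parallel group in a 6S74P module
FIAT_PARALLEL = 1    # 500e has no parallel cells

def capacity_after_one_failure(parallel: int) -> float:
    """Fraction of pack capacity left after one cell drops out.

    The pack is limited by its weakest series group, so the loss is
    1/parallel for that group -- or everything when parallel == 1.
    """
    return (parallel - 1) / parallel

print(f"Tesla: {capacity_after_one_failure(TESLA_PARALLEL):.2%} remains")  # 98.65%
print(f"Fiat:  {capacity_after_one_failure(FIAT_PARALLEL):.2%} remains")   # 0.00%
```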
 

Registered · 25 Posts
Sorry, but what I got on my charges is meaningless since it was based on the car's known-inaccurate % gauge. When I checked it after I got OBD, the car was 7% lower, near the bottom end.
What do you mean your car was 7% lower near the bottom end?

If you have this data, I wouldn't dismiss it so quickly. For example, if you have OBD and Chargepoint readings over time, at least I can learn how similar your case is to mine. If you don't have those, it's a moot point anyway.

I am not sure why you would think battery capacity readings taken with a Level 2 charger are inaccurate. Are you suggesting that Chargepoint's measurements are inaccurate? Are you suggesting losses are higher with Level 2 chargers? It is hard to believe that the battery capacity is significantly higher than the Chargepoint readings.

BTW, I think it is also misleading for you to suggest in other threads that the 500e's battery degradation is 2% over 10k miles.

Prismatic vs. cylindrical: I'm not claiming they are, it just wouldn't surprise me if they are more robust. A lot of cylindrical cells get made: there are thousands in each Tesla, and there are more Teslas out there than any other electric car, so it stands to reason that orders of magnitude more cylindrical cells are made than prismatic ones. Not to mention everyday things like flashlights, electric toothbrushes, etc. that use cylindrical cells. So manufacturers have a lot of practice making cylindrical cells, and the metal case gives an added layer of protection during handling.

The original Model S and X (at least the older ones) used NCA cells, which are actually more volatile than the NMC cells used in the Fiat. NCA has good energy density but is not as stable as NMC. Peak cell voltage on a Tesla is apparently 4.167 V at 100%; I'm not sure what that means relative to the NCA chemistry.

Perhaps the reason the data for the Tesla cells looks so much better comes down to the pack design. The 85 kWh pack is made up of 16 modules of 6S74P. Each parallel cell is connected to the bus bar with a thin wire that acts as a fuse, so if a cell shorts, its connection to the bus burns up and the pack capacity drops by about 1.3%. What I'm not sure about is how the BMS handles a partially failed cell. If it is smart enough to take it out of the circuit, the net effect would be the same: a 1.3% reduction in capacity.

The 500e has just 97 cells in series (none in parallel), so if one gets weak it drags the whole pack down, and if one fails, the entire pack fails. There is no redundancy in the Fiat pack. So it could be that the prismatic NMC cells are actually more robust and reliable than the cylindrical NCA cells in the Tesla, but the lack of redundancy lowers the reliability of the pack as a whole.
Thank you for explaining!
 

Registered · 2013 FIAT 500e · 4,213 Posts
What do you mean your car was 7% lower near the bottom end?...

if you have OBD & chargepoint readings...
11% on my gauge was 18% on OBD.

My Chargepoint readings were from before I had OBD, & since my % gauge seems wrong, anything based on it also seems wrong (like how many % were added by how many Chargepoint kWh).

I am not sure why you would think battery capacity readings taken with a Level 2 charger are inaccurate.
I don't think L2 readings are any more or less accurate than an L1 smart plug's, but we both know that EVSE kWh are not "battery capacity readings", since about 13% of the energy is lost as heat on L1, & an unknown amount on L2.

L1 is just the only spec I know of to compare against, from the EPA.

Are you suggesting that Chargepoint's measurements are inaccurate? Are you suggesting losses are higher with Level 2 chargers? It is hard to believe that the battery capacity is significantly higher than the Chargepoint readings.
I'd guess that Chargepoint data is likely pretty accurate, which means capacity is lower than the readings due to heat losses, which are almost certainly different with L2 than with L1 (I'd guess lower with L2).
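Here's the kind of back-of-the-envelope correction I mean, as a Python sketch (the 13% is the EPA-based L1 figure; the 10% L2 loss is purely my guess):

```python
# Back-calculating pack-side energy from EVSE (wall-side) kWh.
# ~13% L1 heat loss is the EPA-derived figure mentioned above;
# the 10% L2 loss is purely a guess.

def battery_kwh(evse_kwh: float, loss_fraction: float) -> float:
    """Energy that actually reaches the pack, given wall-side kWh."""
    return evse_kwh * (1.0 - loss_fraction)

chargepoint_kwh = 24.0  # hypothetical wall-side reading
print(battery_kwh(chargepoint_kwh, 0.13))  # L1 figure: 20.88 kWh to pack
print(battery_kwh(chargepoint_kwh, 0.10))  # L2 guess:  21.6 kWh to pack
```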

I think it is also misleading for you to suggest... that the 500e's battery degradation is 2% over 10k miles
You are of course free to express your opinion, just like me :)

I think it's likely a bit misleading to suggest a 500e's battery degradation is 6% over 10k miles, since 3 different users got 2% when they actually measured by the EPA method. It also aligns with many users reporting they haven't noticed any range loss.
 

Registered · 2013 FIAT 500e · 4,213 Posts
Some seemingly knowledgeable folks have said that % SOC is notoriously difficult to measure, in which case even going by OBD % is likely less accurate than the EPA method of "dead to full". However:

I believe voltage is pretty accurate, so it might at least give a better indication than %:

You could measure the kWh needed to charge the pack from, say, 325 V to 375 V*.

Then you could compare to cars of different ages & mileages.

Or, to show your own loss over time & miles, you could take readings over an extended period, assuming that EVSE efficiency doesn't also degrade over time.


* You'd have to use the same model EVSE at a specific ambient temperature & battery starting temperature, in the shade, with no breeze, & maybe some other variable(s) I'm missing, like hood & motor cover always in the same positions: Mine would use less energy for cooling, since I charge with the hood open, & my motor cover has been in the back of my garage since 2015.
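If anyone tries this, the bookkeeping would look something like this sketch (the window & readings are made-up examples, not real data):

```python
# Sketch: track degradation by measuring wall-side kWh to charge the
# pack over a fixed voltage window (e.g. 325 V -> 375 V) under the
# same conditions each time. All readings below are made-up examples.

sessions = [
    # (odometer_miles, kwh_for_325_to_375_window) -- hypothetical
    (10_000, 14.2),
    (20_000, 14.0),
    (30_000, 13.9),
]

baseline = sessions[0][1]
for miles, kwh in sessions:
    change = (kwh - baseline) / baseline
    print(f"{miles:>6} mi: {kwh:.1f} kWh into window ({change:+.1%} vs first)")
```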
 

Registered · 25 Posts
11% on my gauge was 18% on OBD.

My Chargepoint readings were from before I had OBD, & since my % gauge seems wrong, anything based on it also seems wrong (like how many % were added by how many Chargepoint kWh).



I don't think L2 readings are any more or less accurate than an L1 smart plug's, but we both know that EVSE kWh are not "battery capacity readings", since about 13% of the energy is lost as heat on L1, & an unknown amount on L2.

L1 is just the only spec I know of to compare against, from the EPA.



I'd guess that Chargepoint data is likely pretty accurate, which means capacity is lower than the readings due to heat losses, which are almost certainly different with L2 than with L1 (I'd guess lower with L2).



You are of course free to express your opinion, just like me :)

I think it's likely a bit misleading to suggest a 500e's battery degradation is 6% over 10k miles, since 3 different users got 2% when they actually measured by the EPA method. It also aligns with many users reporting they haven't noticed any range loss.
I see. I think even though your Chargepoint readings are from before you knew there was a 7% difference, they are still useful, because I can back-calculate and compare.

In my experience, L2 charge values are more precise than my L1 charges at home measured through Kasa smart plugs (I have tried different Kasa smart plugs, but it looks like the problem is not the smart plugs).

I don't think I ever suggested a general battery degradation of 6% over 10k miles. I am simply showing my data and my degradation for my use case of charging and highway use. You, on the other hand, proposed 2% over 10k miles as a general 500e battery degradation value here. I think that's misleading, especially if the trendline of the data you have and the median of the data here show 6% :)
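To be clear about what I mean by trendline: just an ordinary least-squares fit over (miles, SOH) points, something like this sketch (the data points are placeholders, not this thread's actual numbers):

```python
# Least-squares trendline of SOH vs. miles (needs Python 3.10+ for
# statistics.linear_regression). The points are placeholders, not the
# forum's actual readings.
import statistics

data = [(0, 100.0), (10_000, 96.5), (20_000, 92.0), (30_000, 88.5)]

miles = [m for m, _ in data]
soh = [s for _, s in data]
slope, intercept = statistics.linear_regression(miles, soh)

print(f"trendline: {slope * 10_000:+.1f}% SOH per 10k miles")
drops = [soh[i] - soh[i + 1] for i in range(len(soh) - 1)]
print(f"median drop per 10k-mile step: {statistics.median(drops):.1f}%")
```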

Some seemingly knowledgeable folks have said that % SOC is notoriously difficult to measure, in which case even going by OBD % is likely less accurate than the EPA method of "dead to full". However:

I believe voltage is pretty accurate, so it might at least give a better indication than %:

You could measure the kWh needed to charge the pack from, say, 325 V to 375 V*.

Then you could compare to cars of different ages & mileages.

Or, to show your own loss over time & miles, you could take readings over an extended period, assuming that EVSE efficiency doesn't also degrade over time.


* You'd have to use the same model EVSE at a specific ambient temperature & battery starting temperature, in the shade, with no breeze, & maybe some other variable(s) I'm missing, like hood & motor cover always in the same positions: Mine would use less energy for cooling, since I charge with the hood open, & my motor cover has been in the back of my garage since 2015.
Yes, in general I am trying to show my readings over time. It looks like my readings from Chargepoint chargers over the last 4 months have been pretty consistent. I'll keep doing the same and see what the degradation looks like.

It would be good if you could show the same measurements, or degradation readings through OBD or other means, over time too. I'd like to compare my degradation and charges to others' and see how I can optimize.
 

Registered · 2013 FIAT 500e · 4,213 Posts
I think even though your Chargepoint readings are from before you knew there was a 7% difference, they are still useful, because I can back-calculate and compare.
Please explain how you would "back-calculate and compare".

Edit: I just found a more recent readout where 11% on my gauge was 16% on OBD, even though it was 18% the time before, & I have never reset the gauge. So we don't know how many % were actually added during my Chargepoint kWh readings, & I can't see any way to calculate anything from that.

the trendline of the data you have and the median of the data here show 6% :)
I've said over & over & over & over that I very strongly believe the data I showed there is extremely inaccurate.

But thanks for making me realize I needed to edit that into my prior post with the CHARTS BASED ON INACCURATE DATA.
 

Registered · 2013 FIAT 500e · 4,213 Posts
...what I can do to reduce battery wear.
Based on lab test results I've seen at PushEVs.com, it seems it's better to go from 20% to 70% if possible than from 30% to 80% (which seems to be your use pattern).

& according to BatteryUniversity.com, the ideal storage voltage is about 3.9 V/cell, which is around 60% in a Fiat.

So ideally you'd charge to 60%, then "top off" to 70% right before driving it down to 30%, then repeat.
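For reference, the pack-level arithmetic behind that 60% figure, as a trivial sketch (3.9 V/cell is BatteryUniversity's number & 97 series cells is the 500e layout mentioned earlier in the thread):

```python
# Pack-level arithmetic behind the ~60% storage point.
# 3.9 V/cell is BatteryUniversity's storage figure; 97 series cells
# is the 500e layout discussed earlier in this thread.
CELLS_IN_SERIES = 97
STORAGE_V_PER_CELL = 3.9

print(f"{CELLS_IN_SERIES * STORAGE_V_PER_CELL:.0f} V pack voltage "
      f"at the ~60% storage point")  # ~378 V
```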
 

Registered · 203 Posts
Based on lab test results I've seen at PushEVs.com, it seems it's better to go from 20% to 70% if possible than from 30% to 80% (which seems to be your use pattern).

& according to BatteryUniversity.com, the ideal storage voltage is about 3.9 V/cell, which is around 60% in a Fiat.

So ideally you'd charge to 60%, then "top off" to 70% right before driving it down to 30%, then repeat.
Way to take any remaining joy out of owning a car lol.
 

Registered · 2013 FIAT 500e · 4,213 Posts
That's ONLY for @hastalavista & others who are overly concerned with range loss.

I've become much less concerned myself, as the years & miles go by without me or nearly anyone else noticing any loss at all.

Still, if you drive the US average daily distance, it's VERY easy to just start with 60% in the morning, do your round-trip, plug into a standard wall outlet before bed* & unplug when you wake up, since it's back at 60% by then.

*Even easier to use a smart plug or the car's own timer (if you have a driving schedule that's similar each day).

People who drive 120% of the average could plug in an hour before bed & unplug when leaving (after a 1-hour shower/breakfast).

Even drivers who do 250% of the US daily average can fully recharge that entire distance overnight from the OEM cord in a dryer or oven outlet.
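The overnight arithmetic behind that, as a rough sketch (the daily distance, efficiency, & charger figures are my ballpark assumptions, not measured values):

```python
# Ballpark: does overnight charging cover a day's driving?
# Every input here is an assumption -- substitute your own numbers.
daily_miles = 37             # ~US average daily driving (assumed)
mi_per_kwh = 4.0             # assumed 500e efficiency
l1_kw_to_pack = 1.44 * 0.87  # 120 V / 12 A wall-side, minus ~13% loss

hours = daily_miles / mi_per_kwh / l1_kw_to_pack
print(f"average day on L1:        ~{hours:.1f} h")     # ~7.4 h

# OEM cord on a 240 V dryer/oven outlet: ~2.9 kW wall-side.
# The ~10% loss assumed at 240 V is a guess.
l240_kw_to_pack = 2.88 * 0.90
hours_250 = 2.5 * daily_miles / mi_per_kwh / l240_kw_to_pack
print(f"250% of average on 240 V: ~{hours_250:.1f} h")  # ~8.9 h
```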
 

Registered · 2013 FIAT 500e · 4,213 Posts
In my experience, L2 charge values are more precise than my L1 charges at home measured through Kasa smart plugs
Thank you! Looks like smart plugs aren't very accurate.

The first link I found tested "dozens" of units, with 4.47% error for their top choice (it must be "top" for reasons other than accuracy!). The most accurate one there has 1.47% error. Obviously even that won't work for our purpose of trying to measure a loss of 2% or 6%, & neither can stand-alone monitors, which apparently can still be off by 2%.
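To show why even the best plug won't cut it, here's a toy worst-case calculation (the pack readings are illustrative; the 1.47%/4.47% errors are from that review & 2% is the loss we're trying to detect):

```python
# Why a few-percent meter can't resolve a 2% capacity loss.
# Hypothetical full-charge tests a year apart on a pack that truly
# lost 2%: 24.0 kWh, then 23.52 kWh. Worst case, the meter reads low
# the first time & high the second.
true_first, true_second = 24.0, 23.52  # kWh, illustrative

for err in (0.0147, 0.0447):  # most-accurate & "top choice" plug errors
    read_first = true_first * (1 - err)
    read_second = true_second * (1 + err)
    apparent_loss = (read_first - read_second) / read_first
    print(f"±{err:.2%} meter: apparent loss {apparent_loss:+.1%} "
          f"(true loss is 2.0%)")
```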

Supposedly the only tester accurate enough for our purposes is a $60 "clamp style multimeter":
[image: clamp-style multimeter]
 

Registered · 25 Posts
Please explain how you would "back-calculate and compare".

Edit: I just found a more recent readout where 11% on my gauge was 16% on OBD, even though it was 18% the time before, & I have never reset the gauge. So we don't know how many % were actually added during my Chargepoint kWh readings, & I can't see any way to calculate anything from that.



I've said over & over & over & over that I very strongly believe the data I showed there is extremely inaccurate.

But thanks for making me realize I needed to edit that into my prior post with the CHARTS BASED ON INACCURATE DATA.
I would just do the calculation both ways, assuming the charge started at 11% and at 16%, and see how each compares to the kWh reported by the OBD.

I am sure there are going to be some variances in battery degradation depending on use cases and charging conditions. Unfortunately we don't have as much data as Tesla does to reach a high confidence interval. However, I still think this is the best we have versus just stating "500e degradation is VERY slow, at around 2%/10,000 miles". What's the data to back that up?


Based on lab test results I've seen at PushEVs.com, it seems it's better to go from 20% to 70% if possible than from 30% to 80% (which seems to be your use pattern).

& according to BatteryUniversity.com, the ideal storage voltage is about 3.9 V/cell, which is around 60% in a Fiat.

So ideally you'd charge to 60%, then "top off" to 70% right before driving it down to 30%, then repeat.
Thank you for the suggestions. I can't seem to find figures on how much improvement this would offer for my battery degradation versus what I am doing now. As-is, the car usually charges on the scheduled timer to between 70% and 80%. Unfortunately, Fiat doesn't offer a software feature to control the charge limit like almost every other BEV does. That's a mistake, IMO.

That's ONLY for @hastalavista & others who are overly concerned with range loss.

I've become much less concerned myself, as the years & miles go by without me or nearly anyone else noticing any loss at all.

Still, if you drive the US average daily distance, it's VERY easy to just start with 60% in the morning, do your round-trip, plug into a standard wall outlet before bed* & unplug when you wake up, since it's back at 60% by then.

*Even easier to use a smart plug or the car's own timer (if you have a driving schedule that's similar each day).

People who drive 120% of the average could plug in an hour before bed & unplug when leaving (after a 1-hour shower/breakfast).

Even drivers who do 250% of the US daily average can fully recharge that entire distance overnight from the OEM cord in a dryer or oven outlet.
Thank you! Looks like smart plugs aren't very accurate.

The first link I found tested "dozens" of units, with 4.47% error for their top choice (it must be "top" for reasons other than accuracy!). The most accurate one there has 1.47% error. Obviously even that won't work for our purpose of trying to measure a loss of 2% or 6%, & neither can stand-alone monitors, which apparently can still be off by 2%.

Supposedly the only tester accurate enough for our purposes is a $60 "clamp style multimeter":
[attachment 112864: clamp-style multimeter]
I have more accurate test equipment, but it's probably not worth it for me to test and document with it. I'll continue to rely on Chargepoint numbers to track battery degradation. I am pretty sure those are calibrated and checked pretty well.

Unfortunately, the tool you proposed won't work well either. Current clamps usually have single-digit-percentage errors depending on where the cable sits in the clamp, plus manufacturing tolerances. Besides, the total power draw depends on voltage, and you are not recording that with the equipment you showed.
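To illustrate the voltage point, a small sketch (the sag and current numbers are made up):

```python
# Energy is the integral of volts x amps over time. A current-only
# clamp has to assume a fixed line voltage, so any real voltage sag
# becomes error. All numbers here are made up for illustration.
nominal_v = 240.0
true_v = 228.0    # illustrative sag under load
current_a = 13.0
hours = 4.0

clamp_kwh = nominal_v * current_a * hours / 1000  # what the clamp implies
actual_kwh = true_v * current_a * hours / 1000    # what was really drawn
error = (clamp_kwh - actual_kwh) / actual_kwh
print(f"clamp-implied {clamp_kwh:.2f} kWh vs actual {actual_kwh:.2f} kWh "
      f"({error:+.1%})")  # about +5.3%
```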

Have you shared your commute and battery degradation data? What does it look like?
 

Registered · 25 Posts
Prismatic vs. cylindrical: I'm not claiming they are, it just wouldn't surprise me if they are more robust. A lot of cylindrical cells get made: there are thousands in each Tesla, and there are more Teslas out there than any other electric car, so it stands to reason that orders of magnitude more cylindrical cells are made than prismatic ones. Not to mention everyday things like flashlights, electric toothbrushes, etc. that use cylindrical cells. So manufacturers have a lot of practice making cylindrical cells, and the metal case gives an added layer of protection during handling.

The original Model S and X (at least the older ones) used NCA cells, which are actually more volatile than the NMC cells used in the Fiat. NCA has good energy density but is not as stable as NMC. Peak cell voltage on a Tesla is apparently 4.167 V at 100%; I'm not sure what that means relative to the NCA chemistry.

Perhaps the reason the data for the Tesla cells looks so much better comes down to the pack design. The 85 kWh pack is made up of 16 modules of 6S74P. Each parallel cell is connected to the bus bar with a thin wire that acts as a fuse, so if a cell shorts, its connection to the bus burns up and the pack capacity drops by about 1.3%. What I'm not sure about is how the BMS handles a partially failed cell. If it is smart enough to take it out of the circuit, the net effect would be the same: a 1.3% reduction in capacity.

The 500e has just 97 cells in series (none in parallel), so if one gets weak it drags the whole pack down, and if one fails, the entire pack fails. There is no redundancy in the Fiat pack. So it could be that the prismatic NMC cells are actually more robust and reliable than the cylindrical NCA cells in the Tesla, but the lack of redundancy lowers the reliability of the pack as a whole.
Thank you for the explanation. Do you have any good reading on the Tesla pack design? I thought I had seen Teslas needing their packs pulled because of a bad cell or module too. I have not done a lot of reading on those, though.
 

Registered · 363 Posts
Thank you for the explanation. Do you have any good reading on the Tesla pack design? I thought I had seen Teslas needing their packs pulled because of a bad cell or module too. I have not done a lot of reading on those, though.
I found this article to be helpful: Tesla Model S Battery System: An Engineer’s Perspective
There are many other articles and discussions out there. I wasn't able to find out how the BMS handles a partially failed cell. If it doesn't take it out of the circuit, it could significantly drop the capacity of the entire pack. Perhaps that is the issue driving the module replacements? Or it could be that multiple cells in the same parallel circuit have failed.
 

Registered · 2013 FIAT 500e · 4,213 Posts
Thank you for the suggestions. I can't seem to find figures on how much improvement this would offer for my battery degradation versus what I am doing now.
This article at PushEVs.com (click here) has those figures, & the difference is quite significant. The bottom line is basically the lower you can keep the max charge, the better.
 

Registered · 2013 FIAT 500e · 4,213 Posts
I would just do the calculation both ways, assuming the charge started at 11% and at 16%...
That's my point: There's nothing to base the calculation on, since we don't know if the charging started at 11% or 16% or 18% (or something else: it may be different again next time too).

We're looking for maybe a 6% total loss for my 30k miles, & there's a 7% difference just in the starting %!

You recently posted that your % gauge changes pretty drastically just while parked for a few minutes, which is pretty good evidence that any calculations based on it won't be very accurate.

I still think this is the best we have versus just stating "500e degradation is VERY slow, at around 2%/10,000 miles". What's the data to back that up?
3 different users got 2% when they actually measured by the EPA method. It also aligns with many users reporting they haven't noticed any range loss.
HOWEVER, I believe all 3 tests used L2, for which we don't know the heat loss, but I welcome you to do that test yourself, after taking it to 100% to balance the cells. I'm likely not the only one who would be very interested to see what you get.

On the other hand, I've seen no evidence at all to back up the unproven, wildly variable OBD SOH readings on which you're basing some of your estimates. Some of your others seem to be based on the equally untrustworthy % readings.
 

Registered · 25 Posts
This article at PushEVs.com (click here) has those figures, & the difference is quite significant. The bottom line is basically the lower you can keep the max charge, the better.
Thank you for the article, but that's the one I'd already seen. I thought you might have one for batteries more similar to the one in our car.

In any case, the 80% vs. 70% there don't correlate to our 70% and 80%. Does the 70% in the charts correlate with 4 V in the initial chart? I was initially assuming my end charges of 70-80% are probably closer to 70% of the actual full charge.

That's my point: There's nothing to base the calculation on, since we don't know if the charging started at 11% or 16% or 18% (or something else: it may be different again next time too).

We're looking for maybe a 6% total loss for my 30k miles, & there's a 7% difference just in the starting %!

You recently posted that your % gauge changes pretty drastically just while parked for a few minutes, which is pretty good evidence that any calculations based on it won't be very accurate.





HOWEVER, I believe all 3 tests used L2, for which we don't know the heat loss, but I welcome you to do that test yourself, after taking it to 100% to balance the cells. I'm likely not the only one who would be very interested to see what you get.

On the other hand, I've seen no evidence at all to back up the unproven, wildly variable OBD SOH readings on which you're basing some of your estimates. Some of your others seem to be based on the equally untrustworthy % readings.
As I said, it won't be totally useless. One example: sharing your data would at least let us see which of the two (OBD % vs. display %) comes closer to the kWh calculated by the OBD. I don't know why you are so against sharing your data, but I am going to stop arguing for it. We've been going back and forth on this and it seems counterproductive.

The few times I have tried going to 100% with Chargepoint L2 chargers and back-calculating (including when I started from 0%) showed that Chargepoint charges are pretty accurate and precise. I went through my photo album and found one starting point I recorded early on. Interestingly, these charges are actually pretty close to the kWh shown in the OBD and to the degradation in the SOH-C. I tried doing the same thing with L1 chargers, and unfortunately the results vary a lot. I can't nail down whether it is due to temperature variation as I charged the car through the night/day, or to other factors.

[photo: dashboard display with charge readout]



I have actually not noticed a sudden percentage drop after charging to 100%. I only charge to 100% once in a while, when I am testing or when I really need to go somewhere without a charger at the destination. There are of course more normal charges where I didn't go to 100% (like the examples I mentioned in previous comments) where I see a curious drop when parked after charging.

Can you explain a little more about where the 2% came from? Did 3 different cars start charging from a complete 0% with an L2 and measure the amount of charge that went in? Is this data from somewhere in the forum? What did the OBD for these cars say about the kWh of the battery?

What I would appreciate is if you or other owners could help paint a fuller picture of your cars and your charging behaviors. This should help the community figure out a better way to correlate some of these OBD numbers to reality. It is relatively easy to do, IMO. The only tool you need is the Konnwei OBD reader. You can then go to 0% and charge using a Chargepoint L2. No current clamp or other fancy equipment needed.
 

Registered · 2013 FIAT 500e · 4,213 Posts
I'll try to do a detailed reply later, but in the meantime: You can't just "go to 0%".

To balance the cells, first you have to plug in until the car (not the charger) stops the charge, then discharge past 0% until the motor shuts off*, & then measure the full recharge*, in shade at around 20-25 °C ambient, with no breeze, the hood closed, & the motor cover on.

*That takes the inaccurate % readings out of the equation.

It would be very interesting to compare different cars' readings for that, to their OBD SOH readings.

I suggest you do it yourself & report back.
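Once you have the two wall-side readings, the math is just a ratio, e.g. (both figures in this sketch are placeholders for your own measurements):

```python
# EPA-style capacity check: balance charge until the car stops it,
# drive to motor shutoff, then measure the complete recharge at the
# wall. Both figures below are placeholders for your own readings.
measured_wall_kwh = 26.5  # hypothetical full-recharge EVSE reading
baseline_wall_kwh = 27.6  # hypothetical same test when the car was new

loss = 1 - measured_wall_kwh / baseline_wall_kwh
print(f"apparent capacity loss: {loss:.1%}")  # ~4.0% in this example

# Note: this only cancels charger losses if they stay constant between
# tests -- same EVSE model, temperature, shade, no breeze, etc.
```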
 

Registered · 2013 FIAT 500e · 4,213 Posts
...the one I'd already seen. I thought you might have one for batteries more similar...the 80% vs. 70% there don't correlate to our 70% and 80%.
Sure, but apparently for ANY battery, the lower the peak charge voltage, the better, so...:

Does the 70% in the charts correlate with 4 V in the initial chart? I was initially assuming my end charges of 70-80% are probably closer to 70% of the actual full charge.
Right, but it doesn't really matter if lower peak charge voltage is better. You asked how to minimize your degradation, so if charging to 60% gets you to your next charging location, that will cause even less degradation than charging to 70%.

As I said, it won't be totally useless. One example: sharing your data...
Sharing my flawed data seems to me worse than useless. One example: I posted the computer-generated degradation trend based on the apparently-flawed OBD data, thinking it would show that the data is wrong, but you thought it showed that degradation was high!

I have tried going to 100% with Chargepoint L2 chargers and back-calculating (including when I started from 0%)...
According to the service training manual, there's a 5% reserve. So anything based on "0%" is likely incorrect.

The only way to prevent the known-inaccurate % readings from giving false results is to eliminate them from the equation.

One way to do that is to use the EPA method of discharge to motor shutoff & charge til it's shut off by the car, not the charger (which sometimes happens).

Another way might be to go by pack voltage instead of %, like I suggested earlier in this thread.

...where the 2% came from? Did 3 different cars start charging from a complete 0% with an L2 and measure the amount of charge that went in? Is this data from somewhere in the forum?
Yes & yes, except not from the unknown "0%", but rather from the actual motor cutoff, which is the true end of the range.

What did the OBD for these cars say about the kWh of the battery?
I don't know, but I'd be pretty interested to learn; however, I don't recall that it was posted with the users' EPA-method results.

What I would appreciate is if you or other owners could help...It is relatively easy to do, IMO... go to 0% and charge using a Chargepoint L2....
You're much more likely to get others to report their results if you post your own, possibly on a thread with a more appropriate title, HOWEVER:

Results won't mean much if you just "go to 0% and charge"...

You first have to charge until the car stops the charge (to balance all the cells), then discharge to motor shutoff, & then measure a full recharge at about 20-25 °C, in shade, without wind, with the motor cover on & the hood closed.
 

Registered · 25 Posts
I'll try to do a detailed reply later, but in the meantime: You can't just "go to 0%".

To balance the cells, first you have to plug in until the car (not the charger) stops the charge, then discharge past 0% until the motor shuts off*, & then measure the full recharge*, in shade at around 20-25 °C ambient, with no breeze, the hood closed, & the motor cover on.

*That takes the inaccurate % readings out of the equation.

It would be very interesting to compare different cars' readings for that, to their OBD SOH readings.

I suggest you do it yourself & report back.
Thank you for the explanation. Sorry I didn't get a chance to explain what I did in more detail.

I basically did what you asked for: hood closed, motor cover on, drove around my local park's parking lot (which has a Chargepoint) until the motor stopped, plugged the car into the Chargepoint, and waited until Chargepoint said the charge was complete. I didn't measure the amount of breeze, unfortunately. I have also tried it with the motor cover on and off, and there was no difference in the charge taken.

I have also done this 0-100% without waiting for the motor to stop, and the difference is about 0.5 kWh. Can you point me to the page that talks about the 5% reserve? I'd like to read more of that section.

Sure, but apparently for ANY battery, the lower the peak charge voltage, the better, so...:



Right, but it doesn't really matter if lower peak charge voltage is better. You asked how to minimize your degradation, so if charging to 60% gets you to your next charging location, that will cause even less degradation than charging to 70%.



Sharing my flawed data seems to me worse than useless. One example: I posted the computer-generated degradation trend based on the apparently-flawed OBD data, thinking it would show that the data is wrong, but you thought it showed that degradation was high!



According to the service training manual, there's a 5% reserve. So anything based on "0%" is likely incorrect.

The only way to prevent the known-inaccurate % readings from giving false results is to eliminate them from the equation.

One way to do that is to use the EPA method of discharge to motor shutoff & charge til it's shut off by the car, not the charger (which sometimes happens).

Another way might be to go by pack voltage instead of %, like I suggested earlier in this thread.



Yes & yes, except not from the unknown "0%", but rather from the actual motor cutoff, which is the true end of the range.



I don't know, but I'd be pretty interested to learn; however, I don't recall that it was posted with the users' EPA-method results.



You're much more likely to get others to report their results if you post your own, possibly on a thread with a more appropriate title, HOWEVER:

Results won't mean much if you just "go to 0% and charge"...

You first have to charge until the car stops the charge (to balance all the cells), then discharge to motor shutoff, & then measure a full recharge at about 20-25 °C, in shade, without wind, with the motor cover on & the hood closed.
Actually, it is not true that a lower charge voltage is always better. Depth of discharge (DoD) also matters. I wouldn't take it as fact that it's better to go deeper into discharge for our battery. In my case, I am not sure that going from 75% to 25% is worse than going from 50% to 0%. Or, in the extreme case of someone with a short commute, that going from 50% to 40% is worse than going from 10% to 0%. Unfortunately, it is very dependent on the battery design. The rule of thumb I understood, at least for Tesla's batteries, was that it is better to center the discharges around the 50% point.

OK, as I said, I am not going to keep trying to argue for you to share your data. It seems counterproductive.
My take on this: if others can share charge data, starting points, and the OBD reads for each charge, we can at least give the community more data on degradation and on how to read the OBD values.

Can you point to the threads that show the 2% datapoints? I'd like to check and see if there is additional data I can glean from them. I have tried reading threads related to battery degradation but can't seem to find the datapoints you talked about.
 