Tesla stock surges more than 10%


You mentioned this a few times - but I’m having a hard time understanding the concern.

AIUI, a Tesla’s forward-facing camera is mounted higher, and a little further forward, than a typical human driver’s eyes. Neither a human nor an AI driver can see immediately in front of the front bumper, since their view is blocked by the hood. But the higher camera position would probably give the AI a smaller blind spot than the human driver has - certainly no worse.

If that’s the case, I can’t see why this would be an obstacle to AI driving. Perhaps I’m misunderstanding the argument?


Are you kidding? A human sees that region EVERY TIME they walk up to the car to open the driver’s door! The cameras in the car can’t “walk up to the car”!

Just for the heck of it, I just walked out to my driveway to check, and as I walk up to the car, I can see everything on the ground in front of the car except for perhaps the immediate 9-12 inches from the bumper forwards (and no little kid or bike or even major toy can fit in 9-12 inches).

The essential question is: how can the car put itself into gear and drive off (presumably to pick someone up) if it can’t see whether there is something directly in front of it?


Tesla/Musk has allowed transfer of FSD from an older vehicle to a new purchase at least once before.

So, there is precedent.

Here’s a description of the current offer:

{ Earlier this month during the 2024 Shareholder Meeting, Elon Musk was convinced to agree to bring back FSD Transfers for “one last quarter”.


The transfer period is only available for orders placed between June 24, 2024, to August 31, 2024, for people taking delivery of a new Model S, Model 3, Model X, or Model Y. The Foundation-series Cybertruck comes with FSD, and non-Foundation hasn’t arrived just yet, so the transfer offer doesn’t apply to the Cybertruck. }

It seems that if you own an older Tesla with FSD and want to trade up, you can, for a limited time, transfer that “old” FSD to a new purchase - getting Hardware 4 to go with the FSD.


Do they? Given the positioning of my car, if it’s in my driveway facing forward (away from the house), I approach from the driver’s side and never see any part of the blind spot in front of the car - other than perhaps an oblique view of the area right in front of the driver’s-side wheel. If I’ve parked head-in (facing the house), I’ll see the area in front of the car, but none of the area behind it at all. The rear blind spots are typically much larger for a human driver than the front ones, and (again) the rear camera of an AI car is going to have a much, much smaller blind spot than a human-only car.

Ideally, a driver would make a full circuit around the car every time before driving, to make sure there are no obstacles or obstructions that can’t be seen from the driver’s seat, either in front of or behind the car. But I doubt very much that anyone does that regularly, except perhaps when small children live in the house. We rely on parents to make sure that little kids don’t play in the blind spot area immediately in front of, behind, or under parked cars in all other locations.

The same way a human cab driver would. If they’ve been parked somewhere (checking their phone or reading a newspaper), they’re not going to get out of the cab and walk around the car to make sure nothing’s moved into the blind spot while they were sitting. They probably didn’t do that when they first got into the car, either - but even if they did, you solve that by simply having your robotaxis start and end their days in a commercial facility (like a parking garage or lot), which is what would probably happen even apart from this issue.


I think, though I could be wrong, that I read that Musk stated that HW4 will not be able to support AI5 and would not be able to handle Level 5 FSD. I can’t seem to find the source at the moment.

I still could not find the original source I thought I read, but this comes close to making that claim (though not from Musk):

Relevant Snip:

At some point hardware 3 will hit the end of its service life, but that’s not expected for at least several years.

Sadly, Tesla has previously confirmed that they do not intend to have a hardware 3 to hardware 4 retrofit, as the size of the MCU and electrical harnesses differ between vehicle hardware iterations.

Elon Musk also officially announced Tesla’s FSD hardware 5.0, which he says Tesla is now calling “AI 5”. AI 5 is expected to hit the production lines for customer vehicles in approximately 18 months - around December 2025, with a massive slate of improvements. It is expected to be approximately 10x better than hardware 4.0, and up to 50x better in terms of inference power alone.

Of course, these massive improvements don’t come without a cost – AI 5 will consume up to 800 watts of power. In comparison, hardware 3 and hardware 4 use about 200 watts today, so don’t expect any upgrades from hardware 4.

I’ll address these one by one:

  • Rear vision is not at all relevant, because the cameras in the rear are sufficient to see anything that needs to be seen (I’m pretty sure about this, but if I remember, maybe I’ll test it with a toy in my driveway later on).
  • The point isn’t that humans “do” or “don’t” check the area in front of the car before driving off. The point is that humans CAN check, and probably 75+% of the time do it unconsciously (e.g., when you’ve backed into a parking spot, you automatically approach the car from the front most of the time) - while the current crop of cars CAN’T check (because they don’t have eyes that move around). To be a bit morbid: if a human driver runs over their kid in the driveway, it’s a tragedy, and everyone feels very bad about the situation. But if an AV runs over a kid in the driveway, that AV system will be completely shut down within a week (or a day).
  • When a human is sitting in the car for a while, even engrossed in other stuff (a video, a newspaper), they still seem to be aware of people/stuff approaching their vehicle, and with some probability will know if someone/something stopped immediately in front of it. I can think of one exception: I was charging somewhere last week during the day, and I put up the windshield shade because the sun was just too powerful and I wanted to watch a video. In that case, I couldn’t see forward and wouldn’t have seen anything approaching. However, before driving away, I still had to exit the car to unplug from the charger, so I could glance around to see if anyone had left shopping carts/etc in my way.
  • Robotaxi is a different class, I think. I agree that a classic robotaxi - let’s call it a “fleet robotaxi” - will start its day at a depot (where it’s been charging) and will likely return there periodically (to charge some more). I also suspect that fleet robotaxis will have their cameras running almost full time (for liability, ride data, etc), so they will indeed see what is approaching from the sides at all times, and can “know” if something approached the front of the car but never left the area.
  • But the other kind of robotaxi, the kind Musk has described in the past (“Everyone’s Tesla will be a robotaxi if they desire”), will start its day, and most of its trips, in someone’s driveway or garage.
  • Everyone (everyone responsible) with small/medium children takes all sorts of precautions, many of which are done unconsciously. Places where children congregate usually have lower speed limits, and where a LOT of children are present there is at times even a SECOND set of human eyes outside the vehicles directing those cars (carpool lines, school bus stops, etc).
  • The lack of vision directly in front of vehicles is a general problem, one that includes human drivers. There are lots of pieces bemoaning the height of many new vehicles and the fact that the driver often can’t see for many feet in front of the vehicle, up to a substantial height (“toddler height or higher”). It’s been discussed for at least a decade, and it is why I am mystified that Tesla chose not to include that additional camera as standard in all their modern vehicles. I have to wonder whether Tesla, already knowing (based on processing power or whatever) that none of their existing cars will ever be robotaxis, consciously decided to save the money and omit that one extra camera (in the front bumper). The test Model 3 refresh cars (“Highland”) were seen with front bumper cameras, but the eventual product does not have one. It does have HW4, though - so maybe Tesla already knew that HW4 will not support robotaxi mode?

I don’t have the answer. The reason I posted is that HW3 news ceased to be news years ago, so it’s no longer relevant; news about HW4 is.

I fail to see the problem. Technology develops by leaps and bounds; you can’t expect older versions to be supported for long. Suppliers get around the “problem” by offering discounts to old customers.

Tesla has a “problem” if FSD cannot be transferred to new EVs. The monthly fee version solves that “problem” instantly. I hope Elon allows the transfer of full price FSD to new models.

Unfortunately I don’t know how the inference computer technology works, so I don’t know how AI-5 differs from HW-4, or how much of the newest versions of FSD HW-4 can deliver. With the speed at which AI is developing, buying an EV that lasts for 10 or 20 years is a problem. A long time ago I tried upgrading my computers instead of buying new ones, and it turned out to be a fool’s errand.

The Captain


I’m personally skeptical that this is ever really going to happen. Those existing cars lack the ability to open and close their doors automatically, and their internal cameras can’t see the entire interior. That’s going to make it very hard, operationally, for them to function as robotaxis.

But even if that’s the case, the easy fix is to simply require that the human owner circle the vehicle in the morning before putting it into the taxi fleet. Just something that’s their responsibility. They have to affirm that they’ve looked in front of the car before they can activate the Tesla Network (or whatever). If Tesla’s paranoid about it, they can require access to the phone camera so that the AI can “see” that the front of the car is clear before accepting the car into the Network for the day. Once the car’s in the Network, the cameras stay on the whole time.


I’m also pretty skeptical. It depends more on how people work than on how cars work. I suspect there will be “personal vehicles” in the first world for quite a while to come.

However, that isn’t the case that interests me most. The thing that interests me most, and the thing that will add substantial value (personal use value, not necessarily $ value) to vehicles for me is the ability to send the vehicle to places I want/need it to be. Because two of my kids needed a vehicle today, I drove my wife to work this morning and I will pick her up later this evening. That’s 4 drives of 20-30 minutes each! I want to be able to have the car do that by itself. I want to be able to send the car to pick up the kids at their school, or at their afterschool activity, or from their friend’s house. And without a front bumper camera this will never be possible to do safely enough for society to accept truly autonomous driving.


I don’t understand why not. All you need to do is set up the AI to ask for human confirmation that the front is clear before activating from complete shutdown.

So when you want the kids to get picked up from school, you’d just have to confirm the front was clear before you sent it out of your driveway. As long as the car didn’t turn off the cameras until it returned home (a trivial thing to do), the car would always know if something had entered the blind spot - even more securely than a human would.

There’s no reason for the car to ever have to “see” the area immediately in front (or in back) except the first time it moves forward a few feet after being activated from full shutdown. You wouldn’t have any trouble using a car without a front bumper camera the way you describe.

If the car is parked somewhere, waiting to pick me (or my wife, or my kids) up, who exactly will make that confirmation? If my car is parked in my driveway and nobody is home, who’s going to do it? If the car is parked 15 blocks away from the stadium and I request that it come pick us up (it’s raining cats and dogs, so we don’t want to walk), who will do it? Having to have a responsible adult, with the app on their phone, connected to THAT car and to the Tesla backend, etc., severely reduces the utility of any autonomous driving system. Hence my statement: “No true AV without [at least] one additional camera”.

I also agree about the doors. I suspect self-opening/closing doors (like the Tesla Model X has [in part]) may become necessary as well. It reminds me of some of the taxis I’ve used in Japan: the driver has a handle of sorts that allows him (or her, but I’ve never seen a female cab driver in Japan) to open and close the door! I’ve heard that some modern taxis now have electric ones instead of manual ones.

You don’t need anyone to do it. When you first left home, you did the sweep - and then the car simply doesn’t turn off the cameras, even while parked. So the car knows that no children have started playing in front of it.

Because the forward camera is up near the roof, Teslas actually have a pretty small forward blind spot to begin with if the AI is driving, so this is (again) probably safer than having a human drive it from a parking spot.

The only thing you wouldn’t always be able to do is summon it remotely from your driveway with no one home as the very first drive of the day. But not only is that a really unlikely use case in the first place, it’s one that’s completely solved if the car can back up a foot or two before heading forward (as it should be able to do in a standard driveway). The rear camera has no blind spot, and if the car can back up a foot or two, then the front camera will have seen all it needs to in order to ensure that there are no humans or large objects in the blind spot.


This is a very good idea, and I started typing about it in my previous reply. But then I thought about it some more, and in many (most?) circumstances the car physically CAN’T back up to see what’s in front. That covers almost all the relevant situations: inside the garage, backed up in the driveway (to be close enough to the charger), or backed into a parking spot at the supermarket/mall/gym/anywhere. I will say that in my driveway it could probably back up 2 or 3 feet, but I would worry that it would drive over the charger cord or hit the garage door.

Here is a snip from a Tesla M3 front camera (dashcam), showing only the top half of the image. The small out-of-focus area is the front of the hood. The blind spot depth is hard to figure out - you’d need to put down a tape measure to be accurate.



This may not be enough to determine the blind spot size. The dashcam records only one of the front-facing cameras; apparently there are two of them up there, and the second camera may be more wide-angle or tilted differently. Nevertheless, no matter how the cameras are positioned, the front edge of the hood, at the angle from the cameras, can block the view of a significant stretch of ground.

Here is a better look at it. The camera height and eye height positions are very approximate, but the higher camera can see about 50" closer to the car than a person on the ground can. If you measure for a 1-foot-tall object, it is about 30" closer. Sort of surprising that you can’t see the ground within 11.5 ft, if my drawing is ~correct.
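For what it’s worth, the blind spot depth in a drawing like this can be estimated with similar triangles instead of a tape measure. Here’s a minimal sketch in Python - the camera height, hood-edge height, and camera-to-hood distance below are my own rough guesses, not measurements of any actual Tesla:

```python
def blind_spot_distance(camera_h, hood_h, hood_d, object_h=0.0):
    """Horizontal distance from the camera at which an object of height
    object_h first peeks over the sight line that grazes the hood edge.
    camera_h: camera height above ground
    hood_h:   height of the hood edge above ground
    hood_d:   horizontal distance from camera to hood edge
    (any units, as long as they're consistent)"""
    if object_h >= hood_h:
        # Anything taller than the hood edge is visible as soon as it
        # clears the hood itself.
        return hood_d
    # Similar triangles: the sight line drops (camera_h - hood_h) over
    # hood_d, so it reaches height object_h at this distance.
    return hood_d * (camera_h - object_h) / (camera_h - hood_h)

# Assumed dimensions in inches (illustrative only, not measured):
cam_h, hood_h_in, hood_d_in = 55.0, 37.0, 40.0
print(round(blind_spot_distance(cam_h, hood_h_in, hood_d_in), 1))        # ground level -> 122.2
print(round(blind_spot_distance(cam_h, hood_h_in, hood_d_in, 12.0), 1))  # 1-ft object -> 95.6
```

With these made-up numbers, the ground-level blind spot ends about 10 ft from the camera, and a 1-foot-tall object becomes visible roughly 27" sooner than the ground does - the same ballpark as the 11.5 ft and ~30" figures above. Plugging in real measurements would make it exact.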



At the end of the day, he is not interested in selling FSD. He is interested in a captive audience surfing online while being driven.