Thoughts about Tesla FSD

Another comment I made at SeekingAlpha

I am, indeed, an optimist! I’m not an AI professional nor an expert on brains. My field has been data processing (now called IT) and problem solving, which is what programmers do. During my first professional programming task my subconscious solved an intractable problem that my rational mind could not solve. Both the conscious and the subconscious had the same data to work with. Why did the subconscious succeed where the rational brain failed? The subconscious woke me up at 4 AM with the problem entirely solved. I could not wait to go to the office and try it out. This was 62 years ago; that’s as long as I have been pondering this issue.

For the longest time I was convinced that the (non-Boolean) brain is a pattern matching machine. No idea how it actually works, but somehow it could take a pattern and search its memory for a matching one. If the match was good enough, that was the solution. For the ‘intractable problem’ mentioned above, my subconscious ‘remembered’ a computer instruction that solved the problem while I was sound asleep!

Pattern matching is what neural networks do. One thing we know about nature is that it uses redundancy and extreme parallelism: billions of galaxies with billions of stars, millions of sperm to fertilize one lucky egg, billions of gut bacteria, trillions of sea creatures to create the White Cliffs of Dover. Tesla’s Dojo is not just fast, it is massively parallel, nature’s way of solving problems.

This is why I’m optimistic about FSD.

Denny Schlesinger

12 Likes

Computer neural networks are based on how we believe biological neurons work. Multiple inputs to a node (neuron) are individually weighted and summed, and the sum produces an output. This amounts to a big linear algebra problem – a bunch of multiply-add-accumulate operations. Vector math. And the GPU excels at this (it appears biological neurons excel at this too): not just by being highly parallel, but by being optimized for FMADD operations in a way that the general-purpose CPU cannot be (that would require a long discussion about pipeline hazards and instruction dependencies in the CPU). The rendering of 3D graphics to a 2D display required massive amounts of vector and matrix math, so the GPU core was optimized for FMADD operations.
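
For readers who haven’t seen it spelled out, here is a minimal sketch in C of that multiply-add-accumulate (my own illustration, not anyone’s production code; the inputs and weights are made up). A GPU or neural engine simply runs millions of these little loops in parallel:

#include <math.h>
#include <stdio.h>

/* One artificial neuron: weighted sum of the inputs plus a bias, then a
   nonlinearity. The inner loop is exactly the fused multiply-add (FMADD)
   pattern that GPUs and neural engines are built to run in parallel. */
static float neuron(const float *inputs, const float *weights, int n, float bias)
{
    float acc = bias;
    for (int i = 0; i < n; i++)
        acc = fmaf(inputs[i], weights[i], acc);  /* acc += input * weight, one fused op */
    return acc > 0.0f ? acc : 0.0f;              /* ReLU activation */
}

int main(void)
{
    float in[4] = {0.5f, -1.0f, 2.0f, 0.25f};
    float w[4]  = {0.1f,  0.4f, 0.3f, -0.2f};
    printf("output = %f\n", neuron(in, w, 4, 0.05f));   /* prints 0.250000 */
    return 0;
}

A whole network is just many of these nodes layered on top of each other, which is why the hardware question comes down to how many FMADDs per second you can execute and how cheaply.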

The GPU is so much better than the CPU for this, but it’s still not ideal. There are lots of things that a GPU must do to render an image that are not required for ML math – for example, all the gates dedicated to hidden-geometry removal. Gates take area, gates take power, and they aren’t needed for ML work. This is why we’re seeing startups in this realm (Tenstorrent, SambaNova, Mythic AI). It’s also why you are seeing neural network engines on processors in addition to the CPU and the GPU on that same die (Apple A-series, Arm designs, etc.). I’m not sure about the Tesla chip, but I fully expect it to have dedicated neural network hardware that is neither the CPU nor the GPU.

I do believe we eventually get impressive self driving. Eventually. (Caterpillar already has it. As does John Deere.) But I have grown completely tired of Elon saying “it’s coming next year”. Just shut up until it’s done already. :smiley:

3 Likes

Hi guys,

I’m not all caught up with the reading here, but I believe the news of Cruise, purchased and then spun out from GM, operating Level 5 Robotaxis in San Francisco 7 days a week from 10pm to 6am (free for now), is new?

I’m fascinated that their AI is learning from the ‘proactive’ decisions it makes and from how the environment reacts to those decisions.

This is game changer stuff, IMO.

Any thoughts or links for my edification much appreciated,

Jason

I meant to include this link to David Lee’s interview with Cruise management.

https://youtu.be/4xs2jBEF29g

I’m not sure about the Tesla chip, but I fully expect it to have dedicated neural network hardware that is neither the CPU nor the GPU.

Early on Tesla used Nvidia GPUs, but no longer. Tesla uses two chips/computers: one in the cars and a different one, their own highly parallel design, in Dojo, which will be training the neural networks. Currently they are using an older supercomputer about which I have no info except that it is really fast. Dojo will outperform it.

I more or less understand how neural networks are trained but have no idea how the computer in the cars uses the FSD Beta software to navigate.
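
To make that division of labor concrete, here is a toy sketch in C (my own illustration with made-up data, nothing like Tesla’s actual pipeline): training is the expensive, iterative adjustment of weights against labelled examples, which is what Dojo-class hardware is for; afterwards the car only runs the cheap forward pass with the weights it was given.

#include <stdio.h>

/* Toy illustration of training vs. inference: learn one weight and a bias
   so that output ~ 2*x + 1, using plain gradient descent on squared error.
   Training is the slow, iterative part done offline on big hardware;
   inference is the single forward pass the vehicle runs afterwards. */

static float forward(float x, float w, float b) { return w * x + b; }

int main(void)
{
    float xs[4] = {0.0f, 1.0f, 2.0f, 3.0f};
    float ys[4] = {1.0f, 3.0f, 5.0f, 7.0f};   /* "labelled" training data */
    float w = 0.0f, b = 0.0f, lr = 0.05f;

    for (int epoch = 0; epoch < 2000; epoch++) {        /* training loop */
        float dw = 0.0f, db = 0.0f;
        for (int i = 0; i < 4; i++) {
            float err = forward(xs[i], w, b) - ys[i];
            dw += err * xs[i];                           /* gradient terms */
            db += err;
        }
        w -= lr * dw / 4.0f;                             /* nudge weights downhill */
        b -= lr * db / 4.0f;
    }

    /* "Inference": one cheap forward pass with the learned weights. */
    printf("learned w=%.2f b=%.2f, prediction at x=10: %.2f\n",
           w, b, forward(10.0f, w, b));
    return 0;
}

The real thing replaces two numbers with billions of weights and the toy gradient step with far more sophisticated training, but the split is the same: heavy iteration at the data center, a single forward pass in the car.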

About Musk shutting up, not highly likely. :wink:

Denny Schlesinger

Dojo will outperform it

With Tesla chips, and now the newest Apple M1s, is anyone else surprised that the fastest computer chips no longer come from Intel, but from Tesla and Apple?

Intel got too enamoured of the x86 architecture and the huge market it controlled. That is the problem incumbents face with disruptive innovation. If you don’t disrupt yourself, someone else will.

Denny Schlesinger

4 Likes

I don’t think any car on the road today will ever be fully self driving. I define full self driving as having the ability to bring me to my office, then drive home without me so my wife can use the car during the day, then drive back to my office at the end of the day to pick me up. The reason I don’t think any existing car will ever do it is because I strongly suspect that no car on the road today (including my new Tesla) has sufficient processing power to do so reliably. However, processors are indeed getting better and if I had to guess, the first one that will be capable will likely be produced by Apple. I just got a new MacBook Pro with an M1 Max in it, and that thing is a beast (and after last week, it isn’t even the most powerful one they have). I would say that processors will need one or two more generations before being capable of true self driving.

My Tesla has provided me with various small pieces of evidence that lead to this conclusion. And I fully admit that it is a gut conclusion, I cannot “prove” it, but my decades of experience plus those small pieces of evidence lead me to that conclusion.

3 Likes

The reason I don’t think any existing car will ever do it is because I strongly suspect that no car on the road today (including my new Tesla) has sufficient processing power to do so reliably.

I’m just watching a super interesting discussion about the approaches taken by Tesla to solve the self driving problem. Later today I’ll provide a link to it with some remarks about my experience in writing code, not AI code which I have never done, just ordinary business related code. The problems and solutions they encounter reminded me of situations I encountered in my work. The fields are totally different but there is a lot of commonality in the problem solving process. This presentation might change your mind about the arrival of self driving.

Releasing FSD in Beta to selected drivers was a risky but extremely fruitful decision. The number of situations a driver might encounter on the road is, for all practical purposes, close to infinite. You need massive data collection to train your software. Tesla is using the Agile problem-solving method, which breaks big problems into manageable chunks and allows them to improve their products, hardware and software, continuously, bit by bit, instead of in big jumps, model after model, years apart. I have been watching the FSD Beta presentations release after release, commenting on the improvements and the failures. For such a complicated problem the rate of progress is quite remarkable. More in the promised post.

Denny Schlesinger

1 Like

I’m just watching a super interesting discussion about the approaches taken by Tesla to solve the self driving problem. Later today I’ll provide a link to it with some remarks about my experience in writing code, not AI code which I have never done, just ordinary business related code. The problems and solutions they encounter reminded me of situations I encountered in my work. The fields are totally different but there is a lot of commonality in the problem solving process. This presentation might change your mind about the arrival of self driving.

I’ve written code since the 70s. Almost all real-time embedded systems.

The other issue is that Tesla code has lots of bugs, especially in recent releases. That’s usually the result of releasing code prematurely under external pressure (like hurriedly trying to get code out to fix the “video play while driving” issue). In my own Tesla, I have found MAJOR safety-related bugs in the last 2 releases … and I don’t even know how to report those bugs to Tesla (to someone who will care and do something about them).

One of the bugs causes the rear-view camera to not display while driving backward. And this bug is not at all related to the [old] hardware issue with the rear-view camera: one, because my car’s production date is long after that issue was fixed, and two, because it is a completely different type of issue. They also have really bad bugs in their FCW algorithm. This one is particularly bad because it may cause some people to reduce or even turn off the FCW feature! And, when it works properly, it is an essential life-saving feature (and a car-saving one too).

Another bug causes the rear-view image while driving backwards to be jerky and even substantially delayed in many cases (this is likely due to heavy startup work going on in the background and using too much processing power). It could be that the guys doing the Agile work on startup processing didn’t take into account the other guys working on the rear-view camera processing … which just happens to REALLY be needed when everyone backs out of their driveway immediately after getting into the car.
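
I have no visibility into Tesla’s software, so take this only as a hedged sketch of the classic remedy in real-time systems for the scenario described above: if the camera pipeline runs at the same priority as startup housekeeping it can be starved, so it is usually given a higher, real-time scheduling priority. The thread names, priority value, and workloads below are invented for illustration (POSIX threads on Linux; real-time priorities require the right privileges):

#include <pthread.h>
#include <sched.h>
#include <stdio.h>
#include <unistd.h>

/* Illustration only: give the safety-critical camera loop a real-time
   priority so lower-priority startup work cannot starve it. */

static void *camera_loop(void *arg)
{
    (void)arg;
    for (int frame = 0; frame < 5; frame++) {
        /* grab a frame and render it to the display ... */
        printf("camera frame %d\n", frame);
        usleep(33000);                        /* ~30 fps pacing */
    }
    return NULL;
}

static void *startup_housekeeping(void *arg)
{
    (void)arg;
    /* cabin preferences, media, connectivity, map refresh ... */
    printf("startup work running at normal priority\n");
    return NULL;
}

int main(void)
{
    pthread_t cam, housekeeping;
    pthread_attr_t attr;
    struct sched_param prio = { .sched_priority = 80 };

    pthread_attr_init(&attr);
    pthread_attr_setinheritsched(&attr, PTHREAD_EXPLICIT_SCHED);
    pthread_attr_setschedpolicy(&attr, SCHED_FIFO);   /* real-time class */
    pthread_attr_setschedparam(&attr, &prio);

    /* If we lack the privilege for SCHED_FIFO, fall back to defaults. */
    if (pthread_create(&cam, &attr, camera_loop, NULL) != 0)
        pthread_create(&cam, NULL, camera_loop, NULL);
    pthread_create(&housekeeping, NULL, startup_housekeeping, NULL);

    pthread_join(cam, NULL);
    pthread_join(housekeeping, NULL);
    pthread_attr_destroy(&attr);
    return 0;
}

With a setup like this the camera thread preempts normal-priority startup work whenever it has a frame ready, so heavy housekeeping cannot make the rear-view image jerky or late. Whether Tesla’s stack does anything like this is pure speculation on my part.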

Again, this is my “gut” speaking, I have never seen Tesla code, and never seen their benchmarks, and test regimens (by countries/states) for proper full self driving don’t even exist yet.

1 Like

I’ve written code since the 70s. Almost all real-time embedded systems.

In that case, I won’t write the post for you but for the rest of us. LOL

The other issue is that Tesla code has lots of bugs, especially in recent releases.

There is no such thing as bug-free code.

Denny Schlesinger

This morning I got the latest update for BBEdit, a program I have been using for over 20 years. Here is the fix list from just this one maintenance release:

Fixes

Fixed bug in which incoming LSP diagnostics were ignored if the file was open, but its full directory path contained any encoded URL-unsafe characters which required decoding.

Fixed bug in which non-Markdown documentation returned by language servers (for completion, symbol help, and parameter help) did not display correctly when in dark mode.

Made a change so that shell worksheets will use zsh on macOS 10.15 and later if fish is your default shell. (Fish can’t be used in shell worksheets.)

Fixed (nonreproducible) crash which would occur while converting legacy bbcolors format color schemes and the file system API said no.

Fixed bug in which using the “Open” command with a note selected would point the panel at the Notes folder in the notebook, rather than some other appropriate location.

Fixed 13219 error reported when trying to open certain symlinked directories (like /etc) via the Open panel or command line.

Fixed bug in which dragging items between notebooks was either disallowed or dysfunctional. (The intended behavior is that dragging a note from one notebook to another will copy it; moves are not currently supported.)

Fixed bug in which using “bbedit --new-window” to open multiple files would open each file into its own window (rather than all into a single new window, as intended).

Fixed bug in which using the space bar to toggle a menu item’s visibility in the “Menus & Shortcuts” preferences did not take effect immediately.

Fixed bug in which a comma was included between the image file name and the image comment when generating a Markdown image reference via drag-and-drop of an image file.

Fixed bug in which cancelling a long-running or hung #! script would cause the application to hang.

Made a change which might (emphasis intentional) work to prevent macOS from terminating the application process when it decides to, other than at the user’s request.

Fixed bug in which an instaproject’s window title would inappropriately remove the filename extension from the root folder name (which was inconvenient if the folder name ended in a decimal number, for example).

Corrected the ordering of the Notebooks section in the “Open Recent” menu.

Made a change so that the crash reporter doesn’t come up with a report window in cases where the application was exited abnormally because of some event other than a crash. (Such events would include force-quit, kill from the command line, and unwarranted process termination by the OS.)

If the Software Update window is open, there will be a corresponding entry on the Window menu, so that you can activate the window if you lose track of it.

Fixed bug in which the navigation bar icons in Live Preview windows were not correctly laid out.

Fixed bug in how certain errors returned by macOS were reported.

Made a change to allow working around broken language servers that do not exit when receiving an exit protocol notification. If BBEdit hangs when quitting because of such a server, there are two things you need to do:

Report a bug to the server implementor, because the server needs to be fixed to behave correctly.

Run this Terminal command:

defaults write com.barebones.bbedit ForceQuitLSPServerAfterExit_SERVERNAME -int 2

where SERVERNAME is replaced with the file name of the server executable on disk.

So, for example, to work around this bug with the terraform-ls language server, you would use:

defaults write com.barebones.bbedit ForceQuitLSPServerAfterExit_terraform-ls -int 2

The numeric parameter is the maximum number of seconds that BBEdit will wait before terminating the language server’s process. This value is capped at 10 seconds; in practice, for any broken server, “1” or “2” will do.

Step #2 is strictly intended to be a temporary solution for any broken server, and should not be used for longer than it takes the server developer to fix the bug and release an update.

Fixed bug in which “Hex Dump Front Document” was inappropriately enabled when an image view was active, and would crash when chosen. (Hex Dump Front Document isn’t supported for images yet, but you can still use Hex Dump File.)

Worked around macOS bug which would cause the list of revisions in svn or git “Compare Revisions” dialogs to be scrolled up by the height of the list header when the dialog appeared.

Fixed crash which would occur in the Tidy library when doing a “Clean Document” or “Reflow Document” and certain content malformations were present. (The Tidy library has also been updated to 5.8.0.)

Fixed crash which would occur when trying to use “Edit Markup” in a document fragment with a synthetic root tag (created by using a #bbpragma statement).

Fixed layout glitch in the Scripts palette and others which display folder-backed menu contents with keyboard equivalents.

Fixed bug in which toggling the “Enable language server” option did not correctly enable/disable the appropriate controls without first closing and reopening the language server options panel.

Fixed bug in which closing the Find or Multi-File Search window via its close button would activate the window immediately behind, in cases where the search window was itself not the active window.


fin

https://www.barebones.com/support/bbedit/notes-14.1.1.html

1 Like

There is no such thing as bug-free code.

This is so true, and I’ve been saying it for decades!

However, safety systems can’t have critical bugs. And I consider some of these bugs to be critical.

However, safety systems can’t have critical bugs. And I consider some of these bugs to be critical.

That’s why FSD is BETA. At IBM bugs were called “Latent errors to be discovered during field testing.” FSD BETA is ‘field testing.’

I love to watch Mentour Pilot https://www.youtube.com/watch?v=QMmA--l0HKE

It’s amazing the number of bugs passenger airplanes fly with. Your expectations are too high, unrealistic.

Denny Schlesinger

<<However, safety systems can’t have critical bugs. And I consider some of these bugs to be critical.>>

That’s why FSD is BETA.

The bugs I’ve discovered have nothing to do with FSD. They have to do with regular safety features that all cars are required to have. For example, the rear view camera has been required in the USA beginning with all 2018 model vehicles.

Your expectations are too high, unrealistic.

I don’t think expecting the rear view camera to display while the car is being driven in reverse is a high expectation. It’s among the lowest expectations of all!

Sorry, I thought you were talking about software. I can’t speak to hardware, having never seen a Tesla up close.

Denny Schlesinger

Sorry, I thought you were talking about software. I can’t speak to hardware, having never seen a Tesla up close.

I am talking about software. There is a startup problem that I see 2-3 times a week as I back out of my driveway. That’s because the software is doing lots and lots of things right after I have entered the car. But everyone with a driveway just gets into their car and then backs out. That’s what we call normal operation. And the rear-view camera REALLY needs to work in that scenario, because you are backing across a sidewalk and into an active street, so you need to see what is going on at the time.

This issue appeared 2 or 3 versions ago … that’s the version they released too quickly because of the bad PR regarding “videos being allowed to play while driving”.

1 Like

Did you report it? Have they fixed it?

Denny Schlesinger

I love to watch Mentour Pilot

That was fascinating, such detailed knowledge and research.

1 Like

Did you report it? Have they fixed it?

Just received a SW update last night. I’ll check tomorrow.

1 Like

That was fascinating, such detailed knowledge and research.

Enjoy!

The fellow is very professional, not only in his industry but also in his presentation and speaking skills. No fillers, no repeating the same thing several times over to fill space.

Denny Schlesinger