I doubt even Tesla’s made any decisions about anything like that yet - even the timeframe given in the earnings call puts Optimus deliveries a year and a half out. But it does make some sense that this type of product wouldn’t ship with an “everything bagel” software stack, but rather with something tailored to what the customer intends to use it for.
What might be more cumbersome, though, is making sure the robot has “learned” everything specific to what the customer needs. Right now, Optimus isn’t being trained - and may not be designed to be trained - solely on vision. Per the Optimus lead engineer:
“We’ve improved our locomotion stack, frequently walking off-gantry without falls and with a faster, increasingly more human-like walking gait. We’ve built a very low-latency & high-fidelity teleoperation system, used to collect AI training data of the bot imitating humans performing certain tasks… When the time is right, our manufacturing lines will be added to that list,” [Optimus Lead engineer Milan Kovac] noted.
It makes sense that Optimus’ training isn’t just based on vision but also includes data generated through teleoperation. The robot learns from data generated by humans “puppeting” it, not just from watching humans work, which lets Tesla collect tons of tactile and kinesthetic data alongside the visual data. That’s something they can (and apparently intend to) use when training the bots for jobs internal to Tesla. But it’s a process that won’t let these bots just ship out, watch a different factory floor, and then “learn” how to use that equipment - Tesla would have to figure out how to cut the teleoperation step out of training for that to happen.
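To make the distinction concrete, here’s a minimal behavioral-cloning sketch - emphatically not Tesla’s actual stack, just the standard shape of imitation learning from teleoperation. Every name, dimension, and the choice of PyTorch here is a hypothetical stand-in. The point is that the supervision signal is the human operator’s commanded action, which pure video of someone else’s factory floor simply doesn’t contain:

```python
# Minimal behavioral-cloning sketch (hypothetical, not Tesla's pipeline).
# Teleop demonstrations pair multimodal observations with operator actions;
# the operator's commands are the training labels.
import torch
import torch.nn as nn

# Hypothetical feature sizes, chosen only for illustration.
VISION_DIM, TACTILE_DIM, JOINT_DIM, ACTION_DIM = 512, 64, 28, 28

class BCPolicy(nn.Module):
    """Maps vision + tactile + kinesthetic observations to joint actions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(VISION_DIM + TACTILE_DIM + JOINT_DIM, 256),
            nn.ReLU(),
            nn.Linear(256, ACTION_DIM),
        )

    def forward(self, vision, tactile, joints):
        return self.net(torch.cat([vision, tactile, joints], dim=-1))

policy = BCPolicy()
opt = torch.optim.Adam(policy.parameters(), lr=1e-4)

# One batch of teleop frames; random tensors stand in for a real dataset.
vision = torch.randn(32, VISION_DIM)            # image features
tactile = torch.randn(32, TACTILE_DIM)          # fingertip force readings
joints = torch.randn(32, JOINT_DIM)             # joint positions (kinesthetic)
operator_action = torch.randn(32, ACTION_DIM)   # human commands = the labels

opt.zero_grad()
pred = policy(vision, tactile, joints)
loss = nn.functional.mse_loss(pred, operator_action)  # imitate the operator
loss.backward()
opt.step()

# Passively watching a different factory gives you `vision`, but neither the
# tactile/joint streams nor `operator_action` - the supervision disappears.
```

Dropping the teleoperation step means losing both the action labels and the tactile/kinesthetic streams, which is why “just watch the new factory and figure it out” is still an open research problem (usually called learning from observation) rather than a shipping feature.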