Connected/Automated vehicles and infrastructure in Boston

I wasn’t aware of this thread! A recent and unfortunate loss for Boston in the AV space was Optimus Ride. Magna acquired them with no intention of further developing their products, instead looking to apply their software toward driver-assistance tech. Optimus was a six-year-old MIT start-up developing autonomous shuttles for campuses and, potentially, city centers. They did testing in the Seaport and worked closely with the city on future expansions, but their primary markets were southern and west-coast sites.

 
I want self-driving 6-passenger 15mph shuttle-on-bikepath to provide "last 2 mile" service:
- Alewife to Arlington Heights
- Wellington to Malden
- Assembly to Encore Casino
- Winchester to Woburn & Stoneham
- East Boston to Suffolk Downs (west shore of the peninsula)
- Watertown-Harvard Sq-Kenmore-Charles MGH-North Station
- Grand Junction BU to MIT
- Wonderland to Point of Pines
 
A self-driving, 6-passenger, 15 mph vehicle in mixed pedestrian/bike/e-skateboard traffic seems dicey to me. These characters on the e-boards and e-bikes scoot around like maniacs, and most bike paths are only one lane wide at best, usually less.
 
1) Even averaging 8 mph to adapt to mixed path traffic, a shuttle would do 2 miles in 15 minutes, less than half the time of walking
2) Most of the paths could be widened cheaply and easily, since the vehicles are lightweight, narrow, and quiet
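The travel-time claim in point 1 is easy to check. A minimal sketch, assuming a typical ~3 mph walking pace (the speeds here are illustrative, not measured figures):

```python
# Rough travel-time comparison for a 2-mile "last mile" leg.
# Assumptions: 8 mph average shuttle speed in mixed path traffic,
# ~3 mph typical walking pace.
distance_miles = 2.0
shuttle_mph = 8.0
walking_mph = 3.0

shuttle_min = distance_miles / shuttle_mph * 60  # 15.0 minutes
walking_min = distance_miles / walking_mph * 60  # 40.0 minutes

print(f"Shuttle: {shuttle_min:.0f} min, walking: {walking_min:.0f} min")
```

Under these assumptions the shuttle takes well under half the walking time, closer to a third.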

I don't see why "characters on e-boards and e-bikes" should be any harder than any other kind of mixed traffic, and given the low average speeds of all involved, it should be easier to handle safely.
 
I would add to that list a two-branch route from Anderson/Woburn to Burlington and Billerica, and to Lexington via Woburn Four Corners along the power-line rights of way. A narrow path to accommodate such a shuttle could be built at fairly low cost, though it would have to pass through the Mill Pond Conservation Area and its eponymous reservoir, which might run into some environmental and land-use issues. (This might be more of a crazy transit pitch.)
 
It's hard to guess what the traffic dynamics would be. An engineering study with modeling and test runs would be needed to assess feasibility, but it may turn out to be doable with added controls: restrictions, signage, striping, etc.
 
Yeah, I've seen that called a "miniBRT": a bike-speed shuttle running on a path. It's a very popular idea with certain college campuses and the like, especially if the vehicle is all-electric and lightweight.

The market may have spoken to some extent, however; this was basically Optimus's strategy, and they couldn't pull it off.

Right now the "toys for rich people" half of the industry is having a much easier time finding investment funding, unfortunately. In case you couldn't tell, I like transit and basically only care about AVs insofar as they can contribute to it, so I'm kind of meh on this trend.
 
The #1 challenge for autonomous shuttles is the requirement to serve mobility-challenged (i.e., wheelchair) customers. Federally and state-funded entities require this of transportation services, and it is extremely difficult to meet without a human operator/assistant in the vehicle.
 
^ Whenever a driver is intervening, the driver is training the AI.

They say this, but there's no real proof.

For YEARS they said every Tesla had a "shadow mode" where the AI was learning from what drivers were doing.

Either they lied, or what it learned was garbage.

It reminds me of online translation software: a DECADE of "learning" and it still makes second-grade mistakes.

The other example I point to is voice assistants (Siri, etc.). Frankly, they're still crap. Amazon, Google, and Apple have ENORMOUS amounts of data, and they do a great job with "set an alarm for 9am," but give them anything more complicated and they just run a Google query.
 
I personally think the main focus is really about transporting goods more than people.
 
Not sure about that. I was an early Alexa adopter about 7 years ago, and the difference is pretty noticeable.
Every now and again she asks me if she's responding from the right device, along with other questions designed to learn. The advances in home automation have been huge; it's a good example of seamless, affordable technology becoming commonplace and growing in place. I find Siri completely annoying, though.
All this to say that I've no idea how much a Tesla learns, but I'd say there's a fair amount of bluffing until someone actually develops the tech. But it is coming.
 

I think it's absolutely fair and correct to point out where advances have been made, and I would agree from my own experiences that things like voice assistants and home automation have certainly improved over the years (though it's out of my wheelhouse to say how much of that is programming versus learning AI).

On the other hand, there's an enormous difference between something like Alexa or home automation and autonomous vehicles. Apart from, potentially, a home automation system that controls the heating (or a stove, I guess), there aren't likely to be safety implications if those technologies screw up, whether from bad programming or poor AI performance, whereas an autonomous vehicle's screw-ups can easily kill people. So it only goes so far to point out where advances have been made, because it's equally clear that they haven't been made sufficiently to ensure safety. It's also quite clear that Tesla either doesn't realize that or (in my opinion, more likely) doesn't care enough to treat that fact with the responsibility it requires. That they may well get it right eventually doesn't justify failing to keep people safe while they figure it out.
 
Yeah, it seems like Tesla is trying to run before they can walk. Having an automated system for private drivers in parts of Boston is mad.
It seems like you'd get automated highway driving down first, then point-to-point haulage, then look at zoning specific areas in cities.
Not "hey, you got an upgrade, your car can drive around the North End on its own now."
I'm probably preaching to the choir, but it seems to me that you could have total private automation between major built-up areas and then public automation within those areas.
I can see a time in my kids' lives when manually driving vehicles around very built-up areas is considered mad.
 
The actual critical difference between the two is where the heavy lifting occurs. Alexa doesn't do much more than detect the trigger phrase locally; all the processing happens on the server. (The processor in an Alexa is about the same as a decent cellphone from 2014 or so, IIRC.) So they can make it better with better server hardware without you needing to upgrade your hardware locally.

That doesn't fly in a car, at least in any car architecture that isn't entirely insane (some people are working on cloud-computing-enabled autonomy, which is a BAD idea, IMO). You have to carry your computer with you, and we're talking a lot of power.

This is basically why a lot of companies are trying to figure out where to cut corners. A trustworthy Level 5 autonomous vehicle needs ~10 kW of compute, while Tesla's Level-3-pretending-to-be-Level-4 system needs something like 500 W.

Computers that can do the heavy inference work that benefits from very large training data sets are server-farm sized, not car sized. Moore's law might change this someday, but you can't cheat physics with code, no matter how much machine learning you can leverage.
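To put that power budget in perspective, here's a back-of-the-envelope sketch of what a constant compute draw does to an EV's range. The battery size, base consumption, and average speed are assumptions for illustration, not any specific vehicle's specs:

```python
# Effect of onboard compute power draw on EV range, rough numbers.
# Assumptions: 75 kWh usable battery, 250 Wh/mile base consumption,
# 30 mph average urban speed.
battery_kwh = 75.0
base_wh_per_mile = 250.0
avg_speed_mph = 30.0

def range_miles(compute_kw: float) -> float:
    """Range when the compute stack adds a constant compute_kw draw."""
    compute_wh_per_mile = compute_kw * 1000.0 / avg_speed_mph
    return battery_kwh * 1000.0 / (base_wh_per_mile + compute_wh_per_mile)

print(f"0.5 kW compute: {range_miles(0.5):.0f} miles")  # ~281
print(f"10 kW compute:  {range_miles(10):.0f} miles")   # ~129
```

Under these assumptions, a ~10 kW stack cuts urban range by more than half, which is one concrete reason companies look for corners to cut.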
 
One of the similarities is that neither Tesla nor the voice assistants have any permanence.

The Tesla will make the same error every single time it encounters the problem. There are videos where a guy makes the same left turn 100+ times, and it's still botched every time.

There are also videos where the Tesla "loses" an object. One showed a Tesla at a railroad crossing: it saw the train and then immediately tried to drive straight into it, because it didn't understand that there was more train.

Voice assistants are similar. You can't have a conversation with one. It will understand specific tasks (open x, call y, set z) but has no understanding of context.

E.g., "Siri, call my dad" (easy), then "Siri, what's the weather there?" (it will have no idea what you're asking). A human obviously would.
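The "weather there" failure is essentially missing dialogue state. The bookkeeping itself is simple; a toy sketch, where the class, intents, and placeholder lookup are all invented for illustration:

```python
# Toy dialogue state: remember an entity from an earlier turn so a
# follow-up like "what's the weather there?" can resolve "there".
class Dialogue:
    def __init__(self):
        self.last_place = None  # most recently mentioned location

    def handle(self, utterance: str) -> str:
        if "call my dad" in utterance:
            # A real assistant would look the location up from contacts.
            self.last_place = "Dad's city"
            return "Calling Dad"
        if "weather there" in utterance:
            if self.last_place is None:
                return "Where do you mean?"
            return f"Fetching weather for {self.last_place}"
        return "Sorry, I didn't get that"

d = Dialogue()
print(d.handle("siri call my dad"))              # Calling Dad
print(d.handle("siri whats the weather there"))  # Fetching weather for Dad's city
```

The hard part in production isn't storing the state, it's resolving references robustly across open-ended conversation, which takes real compute.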
 
Very good comment. Permanence in that sense is a solved technical problem, in that we CAN teach a computer to make inferences based on semantic context, but it takes some processing oomph. For Siri, the extra server processing power would cost Apple too much, so they limit the functionality. For Tesla, it just can't fit in their little onboard computer. It's a perfect example of one of the things all that extra compute in a real AV is doing that Tesla's isn't.

It's also a good example of how they could make Siri better by investing in more/better servers without changing your hardware.

Also, the train is a good example of a place where some gosh-darn lidars would have helped, if Tesla didn't have an irrational hatred of them.
 

All true, which brings us to the real issue...

Everything is possible, but that doesn't mean it's financially feasible. Maybe we could have a Level 5 self-driving car today, but if it requires $500k in equipment and a $500k supercomputer onboard, then it's not really relevant. And no company is testing that, because they're not doing theoretical research; they're looking for a profitable product to sell.
 
Interesting video of Tesla FSD in San Diego.


2:40 runs red light
3:22 runs over bollards
6:25 tries to drive on rail line
7:45 bollards again

He does this same loop every month, so it's another example of the AI failing to learn. The tracks aren't moving, but it still doesn't understand.
 
