Ford is testing a robotic charging station for electric vehicles

Ford is testing a robotic electric vehicle charging station that could make it easier for people with disabilities to charge cars.

The Michigan-based carmaker showed off a prototype system developed by engineers at the University of Dortmund in Germany.

It consists of a robotic arm that reaches the charging port of an electric vehicle and is controlled by the driver via their smartphone from inside the vehicle.

After charging, the arm retracts into place, and the driver can continue the journey without ever leaving the car.

HOW DOES IT WORK?

The charging station, which can be located in a car park or on the side of the road, has a sliding door that hides the manipulator.

When a driver parks near a station, he or she can open the free FordPass app to open the sliding door and deploy the charging arm.

Once activated, the station cover opens and the charging arm extends to the vehicle’s charging port, guided by a tiny camera.

It fits snugly into the car’s charging point, and the driver only has to wait while the car charges.

Filling up a car with gas or plugging it into a charging station can be challenging for people with disabilities, Ford said.

Disabled drivers make up 5 per cent of drivers in the UK and face significant challenges getting from place to place.

“Ford is committed to freedom of movement and right now, refueling or recharging your car can be a major challenge for some drivers,” said Birger Fricke, Ford of Europe Research Engineer.

“A robotic charging station may be an added convenience for some people, but an absolute must for others.”

After initial lab testing, Ford researchers are now testing the robotic charging station in real-world conditions.

FORD IS RECRUITING ROBOT DRIVERS FOR CAR TESTING

Ford is using two robot drivers – Shelby and Miles – to test its cars in extreme temperatures.

Robots carry out tests in conditions that no human can withstand.

Shelby and Miles can operate in -40°F to 176°F (-40°C to 80°C) temperatures as well as extreme altitudes, Ford said.

Their robotic legs extend to the accelerator, brake and clutch pedals, with one hand dedicated to changing gears while the other is used to start and stop the engine.

In the trial, drivers could track charging status through the FordPass app, which is already available and allows Ford drivers to unlock and start the car’s engine from their smartphone.

Ford said the system has been tested successfully but is not yet available for purchase.

But if it comes out in the future, it could be installed in handicapped parking spaces, car parks or private homes.

In theory, the system will not be limited to Ford vehicles, although this will depend on the charging connector used in different markets.

The process could eventually become fully automated with little or no driver input, Ford says.

The driver would simply send the autonomous vehicle off to a charging station to top up before it returned home.

This would be part of a future in which fully autonomous cars become the norm, something research firm IDTechEx says could happen as early as the 2040s.

Autonomous vehicles are equipped with artificial intelligence (AI) that is trained to detect pedestrians in order to know when to stop and avoid a collision.

But they can only become widespread when they can be trusted to drive more safely than human drivers, and that is likely to be years away.

Autonomous vehicle technology is still learning to master many of the fundamentals, including recognising darker-skinned pedestrians in the dark.

Several self-driving cars have been involved in serious accidents; in March 2018, for example, an autonomous Uber car struck and killed a pedestrian crossing the street in Tempe, Arizona.

According to reports at the time, the Uber safety driver in the car was watching a video on their phone.

SELF-DRIVING CARS “SEE” USING LIDAR, CAMERAS AND RADAR

Self-driving cars often use a combination of conventional 2D cameras and “LiDAR” depth sensors to sense the world around them.

Others, however, rely on visible-light cameras that capture images of roads and streets.

They are trained with vast amounts of information and extensive databases containing hundreds of thousands of clips, which are processed using artificial intelligence to accurately identify people, signs and hazards.

In LiDAR (light detection and ranging) scanning, which is used by Waymo, one or more lasers send out short pulses that bounce back when they hit an obstacle.
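Because the pulses travel at the speed of light, the delay before each echo returns translates directly into a distance. A minimal illustrative sketch of that calculation (not any manufacturer’s actual code):

```python
# Time-of-flight ranging: a LiDAR pulse travels to the obstacle and
# back, so the one-way distance is half the round trip.

SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def lidar_distance_m(round_trip_time_s: float) -> float:
    """Estimate distance to an obstacle from a pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# An echo arriving 200 nanoseconds after the pulse was fired
# corresponds to an obstacle roughly 30 metres away.
print(round(lidar_distance_m(200e-9), 1))  # → 30.0
```

Firing millions of such pulses per second in all directions yields the 3D “point cloud” these sensors build of the surroundings.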

These sensors constantly scan the surroundings for information, acting as the vehicle’s “eyes”.

While the devices provide depth information, their low resolution makes it difficult to detect small, distant objects without the help of a conventional camera working in tandem in real time.

Last November, Apple revealed details of its self-driving car system, which uses lasers to detect pedestrians and cyclists from a distance.

Apple researchers said they were able to get “very encouraging results” in detecting pedestrians and cyclists using LiDAR data alone.

They also wrote that they were able to outperform other 3D object detection approaches using only LiDAR.

Other self-driving cars typically rely on a combination of cameras, sensors, and lasers.

An example is Volvo’s self-driving cars, which use about 28 cameras, sensors and lasers.

A network of computers processes information that, together with GPS, creates a real-time map of moving and stationary objects in the environment.

Twelve ultrasonic sensors around the vehicle are used to identify objects near the vehicle and support autonomous driving at low speeds.

A wave radar and a camera mounted on the windshield read road signs and road curvature and can detect objects on the road, such as other road users.

Four radars behind the front and rear bumpers also detect objects.

Two long-range bumper-mounted radars are used to detect fast-moving vehicles approaching from behind, useful on motorways.

Four cameras – two on the side mirrors, one on the grille and one on the rear bumper – monitor objects in the immediate vicinity of the car and road markings.