wifi not detected

This is my first post, so sorry if I don't know all of the protocol. I just installed Windows 7 two days ago. I have a Toshiba laptop. I'm running Win 7 Home Premium, 64-bit. Problem is, it just disconnected me from my wireless network randomly. Worked fine all day, then just stopped. I reset both my cable modem and the wifi router, to no avail. I'm plugged in right now for access.

I ran the troubleshooter and it says "problem with wireless adapter or access point." The adapter I have is an Atheros AR5007EG. (When I look in the system info tab, it doesn't even know what the name of the adapter is -- everything says "Not Available") Windows says I have the right drivers, so I'm not sure what to do next. I sometimes had a wifi connection problem with Vista -- I would just disable the firewall, very bad, I know -- to work around it.

So, if someone can help with this, it would be great. I don't quite know all the nuances of how to get to .exe files and things like that, so if there's a less technical way to explain it, that would help!

Many thanks!

Hi guys,
So... I'm trying to connect two computers that are on the same network. Both computers connect to the network via wifi. I want to join them to the homegroup (or whatever it's called) to share printers and things like that, but when I create a group on one of the computers, the other one simply acts as though there's no group and only offers me the option of creating a new group...

What may be causing it and how can I fix it?

Thanks a lot!

Hello everyone.

I bought a Broadcom wifi/Bluetooth combo card off of eBay. It's a PCIe card, and I installed it and everything works. I have one issue though: the drivers for it are all installed, but the Bluetooth will not detect anything I want to connect to it.

I do have the drivers installed on the machine for the Bluetooth. But now there's other crap popping up, left over from the older Bluetooth USB dongle that I used to have.

I am VERY confused, and I'm tech savvy so I think there may be some software issue, where the command for the bluetooth module isn't turned on.

Broadcom Bluetooth wifi bluetooth combo -
HP Broadcom Half BCM4313 802.11n wifi Bluetooth 2.1 BCM94313HMGB 600370-001 card

Help please, I'll give rep.

Every time I start my computer from a cold boot, Windows 7 installs all my hardware drivers (wireless card, audio card, etc.)

I haven't seen anything on Google regarding my problem, so maybe I will be able to get some help from the Windows 7 forums. Ha.

Windows: Windows NT6.1 (Build 7100)
Internet Explorer: 8.0.7100.0
Memory (RAM): 2048 MB
CPU Info: Intel(R) Core(TM)2 CPU 6320 @ 1.86GHz
CPU Speed: 1804.3 MHz
Sound card: Speakers (12- High Definition A
Display Adapters: NVIDIA GeForce 8600 GT | NVIDIA GeForce 8600 GT | RDPDD Chained DD | RDP Encoder Mirror Driver | RDP Reflector Display Driver
Monitors: 1
Screen Resolution: 1280 X 1024 - 32 bit
Network: Network Present
Network Adapters: Microsoft Virtual WiFi Miniport Adapter #8 | Broadcom 802.11g Network Adapter #8 | VIA Rhine II Fast Ethernet Adapter
CD / DVD Drives: F: ASUS DVD-E616P3H
COM Ports: COM1
LPT Ports: LPT1
Mouse: 6 Button Wheel Mouse Present
Hard Disks: C: 37.3GB | D: 37.3GB | E: 189.9GB
Hard Disks - Free: C: 22.2GB | D: 32.2GB | E: 103.8GB
USB Controllers: 5 host controllers.
Firewire (1394): Not Detected
PCMCIA (Laptops): Not Installed
Manufacturer: American Megatrends Inc.
Product Make: PT890T-A
AC Power Status: OnLine
BIOS Info:
Time Zone: Pacific Standard Time
Battery: No Battery
Motherboard: ECS PT890T-A
Modem: Not detected

Thanks in advance,

Hey everyone,

So it has come to this. I've searched all possible answers and discussions and forums for a solution to my problem, but I just can't find anything. I own an Acer Aspire One 532h, or AO532H: Intel Atom N450 (1.66GHz), 1GB RAM, Atheros AR5B95 wireless network adapter... Tell me if you need anything else.

So what happened is I got myself a netbook; it had W7 Starter preinstalled. Worked fine, and wifi worked as well with the router I'm trying to connect to now. Decided to try Ubuntu Netbook Remix. Did that, wifi worked as well, but I didn't like it overall, so I came back to W7.

Wifi is not working anymore. What I mean by that is that when I reboot, my computer sees the networks. When I try to connect to my router, it starts connecting and then suddenly says that there are no connections available.

Updated the drivers
Uninstalled the drivers, installed the drivers
ipconfig /release and /renew -> "Media disconnected"
Yes, Fn + F2 turns on the wireless. The amber light is on; I think I can't even turn it off :P
Works wired
Stopped the wireless service and started it again
Tried a USB adapter; it detects networks but can't connect to them :/ (weird)

What else can I give you?

Well I will try to give you all the info necessary to help me
Using a Toshiba laptop that is about 4 months old; I have been connecting without any problem at all locations I go to, until now.
Auto update 2 days ago; since that time I have not been able to connect to the internet. It shows networks but says limited availability and has a red X, even though the signal strength is excellent.
After reading a lot of posts I have tried many things, and determined that when I try to look at the IP config, "windows did not detect any networking hardware".
Have an Atheros AR8152 PCI-E Fast Ethernet controller
Microsoft Virtual WiFi Miniport Adapter
Realtek RTL8191SE Wireless LAN 802.11n PCI-E NIC
All devices have the latest drivers and say in Properties that they are functioning properly.
Have tried disabling them, restarting the computer, and enabling them one at a time, without success.
Please help, I need to turn in school work by Monday.
Thanks to all for your time and help
newbie to forum

I've been running Windows 7 Build 7000 since February without any major issues (the biggest being my Netgear wifi dongle not having a driver, which was fixed without much hassle). Yesterday I decided to install RC1, and strangely I've been having a few problems since.

Monitor Issues
After installing RC1 and restarting my PC after installing a few programs, I experienced no issues with my monitor. I then later restarted it after having installed a few more programs (all programs I used in Build 7000) and upon startup, I found out that Windows couldn’t detect my monitor. I looked around in settings to see if anything was amiss, but I saw nothing, so I shut down my PC, unplugged my monitor and restarted. No luck. I shut down and unplugged it again, then I restarted it and it detected it fine.

It was working fine for most of today. I've done numerous restarts, but this evening, upon restarting, it randomly stopped detecting my monitor and hasn't since. I've tried doing what I did yesterday, but to no avail. It detects my monitor as a "Generic Plug & Play Monitor," meaning I'm running at a really whack resolution, and I can't seem to figure out what's causing the problem.

I haven’t experienced any issues with my monitor until now.

Wireless Network
For some reason, at random intervals, Windows seems to lose the connection to my wireless network (after having at least 80-100% signal the whole time). In order to reconnect I need to unplug my dongle and re-plug it in. This is especially annoying when I leave overnight downloads, as the network will disconnect and not reconnect itself.

Any ideas as to what’s causing the problem? I was previously running XP Professional SP3 and wasn’t experiencing any issues, but I noticed this happening (albeit far less often) in W7 Build 7000 too.

Thanks for the help

I've searched all over the internet, spent over an hour with dell support and found nothing. I'm visiting my parents for the summer and I brought my laptop with me and the wireless was working fine for the first month or so. One random day it stopped working and will not connect. I called dell support and for some reason it just started working when I was on the phone with him, but for the next couple of days it would connect for awhile then not connect 15 minutes later.
It's not the router, because it won't connect to other networks either. The Dell support guy says it's not the hardware (everything in Device Manager says it's fine too). It works when the ethernet cable is plugged in. I installed all the latest drivers for the network adapter (Intel WiFi Link 1000 BGN). I installed a registry fix program, and for a little bit it was working with some networks, but only with limited access. I've tried everything I can think of and I'm stumped, so if anyone could help me, that would be great.

Hey. I installed Build 7057, and my Dynex Enhanced Wireless-G USB Adapter (DX-EBUSB) won't work. The problem started with the device just ceasing to work after 5-10 minutes of use. I uninstalled and "reinstalled" the drivers, and there's still no luck. The device can detect my Router, but it won't connect to it. I get the "limited access" popup. Something else to note is that during the driver installation, it gives a "Please plug in the device to a USB port" message, although it won't recognize the plugged-in device. Weird, eh?

This is weird, since I was able to DL the 64-bit driver for Vista/W7 Build 7000, and it would work.

My new laptop with Vista Home Premium can detect the wireless network at my job but will not connect. I have tried a manual connection, but with no luck. The computer sees the network as unnamed and unsecured; even after a manual setup it never "sees" it properly.

The signal strength is excellent. Every computer here with XP can connect, but none of our Vista laptops are able to.
Any ideas, or does anyone know which fix to download?

I use my home network to stream music and movies from my computers to my Xbox and TV. Now all of a sudden, nothing. Both computers are running Windows 7 Ultimate x64, connected online through wifi, and connected to the network homegroup (that works without any issues), but neither of them detects any of my media devices. I have a Sony Viera ST30 TV and an Xbox 360, both with wired connections to the same network that the computers are on. Features that require the internet work perfectly fine on both. And I have a Viera app on my phone that I used to use to control my TV (the app may not be compatible with my phone, so I'm not really worried about it). Even with sharing enabled on the computers, they just won't detect my TV or my Xbox.

Here's the doozy. If I connect my laptop to the network via wired connection, my devices are found. If I connect to a friend's wireless network, their devices are found as well.

There has been no change on either computer from the last time it was working, till now.

The only thing I can possibly think of is it has something to do with the network map below. I just have no idea how to fix it.

Re: Media device detection problem... Forget it. I fixed it myself. 315 views and not one person could help. USELESS.

I recently got a notebook computer, and so I dug out the old 802.11g
router so I could use it around the apartment. In my old residence, I
used it, but there were no other networks detectable, and I had no
problems. Now I have neighbors with WiFi networks.

Now, about every 5-10 minutes, I lose my connection. It seems to be
when the list of available networks changes, either losing sight of a
network or finding a new one. I have to click the taskbar icon, and
reconnect every time it happens (though since mine is listed as the
preferred network, it is always the one selected). The computer is only
30 feet at most from the router, and the connection is always
"excellent", but the connection gets dropped every 5-10 minutes and I
have to manually reconnect.

Is there any way to prevent the dropping of the network connection, or
to reconnect automatically (I'm thinking along the lines of the old
"keep alive" programs for dialup)? It is just very annoying, not only
having to reconnect, but the fact is I am retyping this message because
I lost the connection as I was trying to send it.

Thanks in advance

John R Rybock

Samsung LC-20. XP SP3. Has built-in wifi.

Trying out external wifi adapters (ones with better, more directional
aerials). Two different models.

One I tried a few months ago - I don't have it to hand so can't tell you
the details, but it had a dish aerial. (No, not a Hawking.) I can't
remember the details, other than that it did work briefly, then started
to cause the PC to spontaneously reset when plugged in.

The latest one I've tried is this: http://bit.ly/10xdGox . When I first
plug it in, it doesn't cause problems of the resetting sort; it is
recognised as new hardware, but no existing driver is found. When I
install the driver from the little CD that came with it, the
installation appears to complete, but then the PC crashes. If I do it
without the new hardware plugged in, all seems well - until I plug it
in, when it crashes within a second or three: quite spectacular crash,
in that the screen goes blank, and the PC then restarts - as if
something is drawing too much from the power supply or something. (Both
have been USB devices.) If I system restore to before I loaded the
drivers, I can plug the device in, and as before no crash occurs, the
device is detected, but no in-built driver is found.

I've tried both disabling the built-in wifi, and not doing so; it
doesn't seem to make any difference.

Any ideas?

(I have one more to try, which I'll try tomorrow as it's getting late:
http://bit.ly/10xf9ev . Since that one claims to work on '98, I can also
try it on an old '98 laptop I have.)
J. P. Gilliver. UMRA: 1960/1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf


I've been fighting to clean a virus off my wife's Win 7 laptop. We've battled to a standstill, but I believe the enemy is still lurking on the battlefield (the laptop) and I need the ultimate weapon to win this war!

I'm going to put the details of my battles to date here in case this helps someone else identify a similar problem and find a solution quicker than I have.

The Battle

I first noticed the malware when a click on a link in Google went to an obviously wrong website, but using (right-click)"Open in new tab" went to the correct webpage. I then looked for MSE in the systray to run a full scan, but it wasn't there. The malware was apparently disabling MSE and would also kill it immediately every time I tried to run MSE manually from the Start menu.

After some research (on another PC), I booted into Safe Mode and ran SuperAntiSpyware and Malwarebytes. Both were freshly downloaded on another PC and transferred to the infected PC via a USB stick. Neither scanner found anything.

Further research listed some possible suspects and a process to use to kill (even temporarily) the offending item. Running AUTORUNS (from MS Sysinternals), I found a suspicious .DLL in an unusual location. The file, BITSADMINQ.DLL, was located in the C:\Users\...\AppData\Roaming folder.

Now for the freaky part: I had not started AUTORUNS in Administrator mode, so I couldn't delete the item. I closed AUTORUNS and did (right-click)"Run as administrator" to run AUTORUNS with elevated privilege to eliminate the entry. I couldn't find the entry again! It was still there because, after renaming the file, on the next reboot I received a pop-up RUNDLL message that the file was missing, so there is an entry somewhere to run this file on start-up.

Using the information gained on the first run, I located the RUNDLL entry using Process Explorer and killed the process. I could then start MSE and I ran a full scan. Still no hits! However, with the process disabled, I was able to use AUTORUNS to locate the malware registry key in
"HKCU\Software\Microsoft\Windows\CurrentVersion\Run"
and remove it.

To verify this file was indeed malware, I uploaded the .DLL file to VirusTotal. The results were mixed. Only 7 of the 42 scanners run identified the file as malware: Avast 4 and 5, BitDefender, F-Secure, GData and Ikarus. The other scanners were happy with it. Thinking I now had a process that could completely identify and clean the offender, I burned the bootable BitDefender CD (again on another PC), and rebooted the infected PC using this CD. I was running the ISO file dated January 2011, but the CD did not detect my WiFi network connection so it couldn't download the updated signature list. This scan also didn't identify any malware, especially this .DLL file, so this file was still just "suspect".

Since running with the .DLL file renamed didn't appear to cause anything to break, I shredded the file, but I still get the feeling there may be pieces of malware lingering.

The Aftermath
I know it's impossible to prove a negative, but I'd like some way to feel good about this system again. I feel "icky" using that laptop because of my lingering doubts.

Also, I'm concerned the USB stick I used may have gotten infected, so I'm looking for advice to safely verify it is clean.

Suggestions and comments very welcome!

Hello, I have Windows 8 Pro, purchased yesterday. All fine! But if I go to twitter.com in the Chrome browser or Internet Explorer and try to write a tweet with geotagging, it's impossible: automatic geotagging is not available and shows me an error. It doesn't detect my position automatically. In Windows 7, in any browser, no problems. In any weather app in Windows 8, or in Google Maps on the web, the same: Windows doesn't recognize my position automatically. Is it a bug in Windows 8? If I try the same on my iPhone or iPad, which are both on the same wifi network in my home, auto geotagging works fine!! So I think it is not a router problem, only a Windows 8 problem. Please help. In Twitter, geotagging is ON, and in the Windows configuration too. What is the error?? Please answer!

I have a Windows 7 HP Pavilion laptop which I use primarily at my house over a wireless internet connection through a Netgear WPN824 wireless router. All of the other machines (Vista and XP) in my house can access the internet both wirelessly and over ethernet with "autodetect settings"; on the Windows 7 machine I had to manually assign an IP address. It's fine at my house, but if I go out to a public WiFi hotspot I can't connect, either with the IP that is in there or with it set to "autodetect", and it is the same way for a wired connection. So my question is: can I either have a profile for my house with the IP added and another for when I'm away, or make it so it can automatically detect all the time? It sees all of the wireless connections no matter what, but I get "limited connection" if I'm not at my house with an assigned IP address.

I use a Linksys Wireless-G adapter, and I'm having a lot of problems getting Windows to detect it and install the drivers.

In device manager it detects as an unknown device (and both plugging in/unplugging the device has windows make a noise as it should). However when I try to install the driver via device manager - directing it to my CD drive and the drivers folder, it still says it can't find the drivers.
Installing from the disc doesn't seem to work either.

I've been reading a few posts, but I don't think others were having the same exact problem. I saw the suggestion to uncheck box 6 in network properties, but I'm not sure what that was referring to. Also router setup (accept anonymous connections) and, in the WiFi Properties window, under the adapter's advanced configuration, setting 802.11x to long and short.
I looked around a bit and didn't see those options, so maybe they become available after the drivers are activated. I'm really unfamiliar with Win7 (and Vista for that matter), so I could use some very specific help.

In this article, we will give an overview of the technical side of Project Detroit, the Microsoft-West Coast Custom Mustang creation. If you're not already familiar with this project, you can find more information here.
Key Design Decisions

It's important to keep in mind that this car was built for a TV show with a set schedule. As a result, a number of unique design decisions came into play.


Working backwards, the reveal for the car was set for Monday November 28, 2011 at the Microsoft Store in Bellevue, Washington. We started the project in early August, which gave us approximately 12 weeks for research, development, vehicle assembly, and testing. This was by far the #1 design decision as any ideas or features for the car had to be implemented by the reveal date.
Off the Shelf Parts

Another key design decision was to, where possible, use off-the-shelf hardware and software in order to allow interested developers to build and reuse some of the subsystems for their own car (at least the ones that don't require welding). For example, instead of buying pricey custom-sized displays for the instrument cluster or passenger display, we used stock Samsung Series 7 Slate PCs and had West Coast Customs do the hard work of building a custom dash to hold the PC.

Hardware and Networking

The car is packed with a variety of computers and networking hardware.

Instrument Cluster Slate: This slate is on the driver's side and manages the instrument cluster application and the On-Board Diagnostics (OBD) connection to read telemetry data from the car.
Passenger Slate: This slate, which is built into the passenger's side, runs a custom Windows 8 application (see Passenger slate below).
Laptop 1: This laptop runs the REST service to control different parts of the car, the Kinect socket service for the front Kinect, and the user message service to display messages on the rear glass while driving.
Laptop 2: This laptop runs the Heads Up Display (HUD) service, the Kinect socket service for the back Kinect, the OBD-II database, and Azure services.
Windows Phone: A Nokia Lumia 800 connects via WiFi and runs a custom Windows Phone 7 application (see Windows Phone application below).
Xbox 360: The Xbox 360 displays on either the passenger HUD or the rear glass display.
Networking: A NETGEAR N600/WNDR3700 wireless router provides wired and wireless access for everything in the car, used in conjunction with a Verizon USB network card plugged into a CradlePoint MBR900 to provide an always-on 3G/4G LTE internet connection. The slates, laptops, and Xbox 360 are connected via CAT5e cable, while the Windows Phone 7 connects via WiFi.
Note: One of the limitations of the Kinect SDK is that if you have multiple Kinects plugged into one PC, only one of those Kinects can do skeletal tracking at a time (color/depth data works just fine). Because of this, we decided to have a dedicated laptop plugged into the front Kinect and another laptop plugged into the back Kinect in order to allow front and back skeletal tracking at the same time. If we'd not used simultaneous skeletal tracking, we could have combined all of the systems onto a single laptop.

Here is a quick overview of the application architecture.

REST Service Layer

The REST Service Layer allowed different systems to talk to one another. More importantly, it allowed different services to control hardware they normally wouldn't be able to access.

Thin client approach: The solution we chose was to have all the services that control different parts of the car reside on the laptops, and have client applications like the Windows Phone application send REST commands so that the service layer would execute the request.
REST-enabled hardware: Controlling hardware should be invisible to the consuming clients. For example, hardware that requires USB communication would be impossible to control from a Windows Phone. The service layer allowed us to control hardware in a way that was invisible to the end user.
Helper libraries: To simplify communication with the service layer, we built a set of helper classes to abstract out repetitive tasks like JSON serialization/deserialization, URI building, etc. For example, to get the list of car horn ringtones, the client application can call HornClient.List() to get back a list of available ringtone filenames. To set the car horn, the client calls HornClient.Set(filename), and to play the car horn, it then calls HornClient.Play(filename). The helper libraries were built to work on Windows 7, Windows 8, and Windows Phone 7.
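The URI-building half of such a helper can be sketched roughly as follows. This is not the shipped Project Detroit code; the ServiceUri name, the host address, and the "/api/{service}/{action}" path pattern are assumptions made for illustration:

```csharp
using System;

// Hypothetical sketch of the kind of shared helper described above:
// every client formats request URIs through one method instead of
// concatenating strings ad hoc. Host and path layout are assumptions.
public static class ServiceUri
{
    public const string Host = "http://localhost:8080"; // assumed service address

    // Build a request URI for a given service and action, e.g. ("Horn", "List").
    public static string For(string service, string action)
    {
        return string.Format("{0}/api/{1}/{2}", Host,
            service.ToLowerInvariant(), action.ToLowerInvariant());
    }
}
```

A HornClient.List() call would then issue an HTTP GET against ServiceUri.For("Horn", "List") and deserialize the JSON response, keeping the URI and JSON plumbing out of every client application.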

We have already released an article and library on the OBD-II portion of the car. In short, OBD-II stands for On-Board Diagnostics. Hooking into this port allows one to query for different types of data from the car, which we use to get the current speed, RPMs, fuel level, etc. for display in the Instrument Cluster and other locations. OBD can do far more than this, but it's all we needed for our project. Please see the linked articles for further details on the OBD-II library itself.
For the car, because only one application can open and communicate with a serial port at one time, we created a WCF service that polls the OBD-II data from the car and GPS data from a Microsoft Streets & Trips GPS locator, and returns it to any application that queries the service.
For the OBD library, we used a manual connection to poll different values at different intervals. For values critical to driving the car, like RPM, speed, etc., we polled for the values as quickly as the car could return them. For other values that weren't critical to driving the car, like the fuel level, engine coolant temperature, etc., we polled at a 1-2 second interval. For GPS, we subscribed to the LocationChanged event, which fires when the GPS values change.
Rather than creating a new serial port connection for every WCF request for OBD data, we created a singleton service that is instantiated when the service first runs. Accordingly, there is only one object in the WCF service that represents the last OBD and GPS data returned, which is kept current by the continual reading of the latest OBD data using the OBD library as described above. This means that calls to the WCF service's ReadMeasurement method didn't actually compute anything; they simply serialized the last saved data and returned it via the WCF service.
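The pattern described here (one background reader, many cheap reads) might look something like the sketch below. The ObdCache and ObdSnapshot names are illustrative, not the actual service types:

```csharp
using System;

// Illustrative sketch of the "poll once, serve many" pattern:
// a single OBD reader thread keeps the latest snapshot, and every
// service call just returns that snapshot with no serial-port work.
public class ObdCache
{
    private volatile ObdSnapshot _latest = new ObdSnapshot();

    // Called continuously by the single OBD reader.
    public void Update(double mph, double rpm)
    {
        // Replace the whole snapshot so readers never see a torn write.
        _latest = new ObdSnapshot { MilesPerHour = mph, Rpm = rpm };
    }

    // Called by any number of WCF requests; returns the cached data.
    public ObdSnapshot ReadMeasurement()
    {
        return _latest;
    }
}

public class ObdSnapshot
{
    public double MilesPerHour;
    public double Rpm;
}
```

The key design choice is that the expensive resource (the serial port) is owned by exactly one loop, while reads are just a reference copy, so request load never contends for the hardware.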
Since WCF supports multiple protocols, we implemented HTTP and TCP and ensured that any WCF service options we chose worked on Windows Phone, which, for example, can only use basic HTTP bindings.
To enable the ability to change the programming model later and to simplify the polling of the service, we built a helper library for Windows and Windows Phone that abstracts all the WCF calls.
The code below creates a new ObdService class and signs up for an event when the measurement has changed. The Start method does a couple of things: it lets you set the interval at which you want to poll the ObdService, in this case every second (while the instrument cluster needs fast polling, the database logger can poll once a second). It also determines what IP address the service is hosted at (localhost), the protocol (HTTP or TCP), and whether to send demo-mode data. Since one of the main ways the car is showcased is when it's stopped on display, demo mode sends fake data, instead of always returning 0s for MPH, RPM, etc., so people can see what the instrument cluster would look like in action.

_service = new ObdService();
_service.ObdMeasurementChanged += service_ObdMeasurementChanged;
_service.Start(new TimeSpan(0, 0, 0, 0, 1000), "localhost", Protocol.Http, false);

void service_ObdMeasurementChanged(object sender, ObdMeasurementChangedEventArgs e)
{
    Debug.WriteLine("MPH=" + e.Measurement.MilesPerHour);
}

OBD-II Database & Azure Services

To record and capture the car telemetry data like MPH, RPM, engine load, and throttle (accelerator) position, as well as location data (latitude, longitude, altitude, and course), we used a SQL Server Express database with a simple, flat Entity Framework model, shown below. The primary key, ObdMeasurementID, is a GUID that is returned via the ObdService. Just like above, the database logger subscribes to the ObdMeasurementChanged event and receives a new reading at the time interval set in the Start() method.

The Windows Azure data model uses Azure Table Services instead of SQL Server. The data mapping is essentially the same since both have a flat schema.
For Azure Table Storage, in addition to the schema above, you also need a partition key and a row key. For the partition key, we used a custom TripID (GUID) to represent a trip. When the car is turned on or off, a new TripID is created. That way we could group all measurements for that particular trip and do calculations based on it, like the average miles per gallon, distance traveled, fastest speed, etc. For the row key, we used a DateTimeOffset and a custom extension method, ToEndOfDays(), that produces a unique numerical string (since Azure's row key is a string type) by subtracting the time from DateTimeOffset.MaxValue. The result is that the earlier a DateTime value, the larger the number.
Time=5/11/2012 9:14:09 AM, EndOfDays=2520655479509478223 //larger
Time=5/11/2012 9:14:11 AM, EndOfDays=2520655479482804811 //smaller
Since they are ordered in reverse, with the most recent date/time being the first row, we can write an efficient query that pulls just the first row to get the current latitude/longitude without needing to scan the entire table for the last measurement.

public override string RowKey
{
    get { return new DateTimeOffset(TimeStamp).ToEndOfDays(); }
    set { /* do nothing */ }
}

public static class DateTimeExtensions
{
    public static string ToEndOfDays(this DateTimeOffset source)
    {
        TimeSpan timeUntilTheEnd = DateTimeOffset.MaxValue.Subtract(source);
        return timeUntilTheEnd.Ticks.ToString();
    }

    public static DateTimeOffset FromEndOfDays(this String daysToEnd)
    {
        TimeSpan timeFromTheEnd = new TimeSpan(Int64.Parse(daysToEnd));
        DateTimeOffset source = DateTimeOffset.MaxValue.Date.Subtract(timeFromTheEnd);
        return source;
    }
}

To upload data to Azure, we used a timer-based background uploader that would check whether there was an internet connection, then filter and upload all of the local SQL Express rows that had not yet been submitted to Azure, using the Submitted boolean database field. On the Azure side, we used an ASP.NET MVC controller to accept the data. The controller deserializes the data into a list, adds the data to a blob, and adds the blob to a queue, as shown below.
A worker role (or many) then reads items off the queue and places the new OBD measurement rows into Azure Table Storage.

public ActionResult PostData()
{
    try
    {
        StreamReader incomingData = new StreamReader(HttpContext.Request.InputStream);
        string data = incomingData.ReadToEnd();
        JavaScriptSerializer oSerializer = new JavaScriptSerializer();
        List<ObdMeasurement> measurements =
            oSerializer.Deserialize(data, typeof(List<ObdMeasurement>)) as List<ObdMeasurement>;
        if (measurements != null)
        {
            CloudBlob blob = _blob.UploadStringToIncoming(data);
            _queue.PushMessageToPostQueue(blob.Uri.ToString());
            return new HttpStatusCodeResult(200);
        }
        ...
    }
}
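The worker-role side then reduces to draining the queue and inserting rows. Here is a self-contained sketch of that loop with in-memory stand-ins for the Azure queue and table; the names and structure are assumptions for illustration, not the real Project Detroit worker code:

```csharp
using System;
using System.Collections.Generic;

// Sketch of a queue-draining worker: each queued item is a blob URI,
// the blob is fetched and parsed into measurement rows, and the rows
// are written to table storage. Queue and table are in-memory stand-ins.
public class UploadWorker
{
    private readonly Queue<string> _queue;                 // stands in for the Azure queue
    private readonly Func<string, List<string>> _fetch;    // blob URI -> measurement rows
    private readonly List<string> _table = new List<string>(); // stands in for table storage

    public UploadWorker(Queue<string> queue, Func<string, List<string>> fetch)
    {
        _queue = queue;
        _fetch = fetch;
    }

    // One pass over the queue; a real worker role would loop forever.
    public int DrainOnce()
    {
        int processed = 0;
        while (_queue.Count > 0)
        {
            string blobUri = _queue.Dequeue();
            _table.AddRange(_fetch(blobUri)); // insert rows into "table storage"
            processed++;
        }
        return processed;
    }

    public int RowCount { get { return _table.Count; } }
}
```

Because the controller only enqueues a pointer to the blob, the web tier stays fast under load, and any number of worker instances can share the insertion work.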

Instrument Cluster

Much of this is also covered in our previously released OBD-II library, where the instrument cluster application is included as a sample. This is a WPF application that runs on a Windows 7 slate. It contains three different skins designed by 352 Media: a 2012 Mustang dashboard, a 1967 Mustang dashboard, and a Metro-style dashboard, each of which can be "swiped" through. This application queries the OBD-II WCF service described above as quickly as it can to retrieve speed, RPM, fuel level, and other data for display to the driver. The gauges are updated in real-time, just as a real dashboard instrument cluster would behave.


The HUD (or Heads Up Display) application runs on one of the two Windows 7 computers in the car. This is a full-screen application whose output goes through a projector to a series of mirrors and a projection screen, and is then reflected onto the windshield. To install these, we altered the car's body and created brackets to mount the mirrors and projectors. In the picture on the left, you can see the dashboard's structural member pivoted outward, the 12" section we removed, and the base plate we added to allow light to be reflected through to the windshield. Bill Steele helped design and implement the physical HUD aspect of the car.

The HUD application has several different modes. The mode is selected from the Windows Phone application.

POI / Mapping - This uses Bing Maps services. The phone or Windows 8 passenger application can choose one of a select group of categories (Eat, Entertain, Shop, Gas). Once selected, the REST service layer is contacted and the current choice is persisted. The HUD constantly polls the service for the current category, and when it changes, the HUD switches to an overhead map display showing the closest locations in that category, along with your continuously updated GPS position and direction. The list of closest items in the category is requested every few seconds from the Bing Maps API and the map is updated appropriately.
Car telemetry - In car telemetry mode, the OBD data from the WCF service described above is queried and displayed on the screen. This can be thought of as an overall car "status" display with the speed, RPMs, real-time MPG, time, and weather information.
Weather - We use the World Weather Online API to get weather data for display on the HUD. This API allows queries for weather based on a latitude and longitude, which we have at all times. A quick call to the service gives us the current temperature and a general weather forecast, which we display as an icon next to the temperature in the lower-left portion of the screen.
Kinect - Using our Kinect Service, with the standard WPF client code, we can display the rear camera on the HUD to help the driver when backing up. See the Kinect Service project for more information on how this works and how to use the service in an application of your own.
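The HUD's poll-and-compare loop for the POI category can be sketched as follows (a hypothetical fetch_category callable stands in for the REST call, whose actual endpoint the article does not show):

```python
class CategoryPoller:
    """Remembers the last selection and fires a callback only on change."""

    def __init__(self, fetch_category, on_change):
        self._fetch = fetch_category   # e.g. an HTTP GET of the persisted choice
        self._on_change = on_change    # e.g. switch the HUD into map mode
        self._current = None

    def poll_once(self):
        latest = self._fetch()
        if latest != self._current:
            self._current = latest
            self._on_change(latest)

# Usage: simulate three polls where the phone changes the selection in between.
seen = []
selections = iter(["Eat", "Eat", "Gas"])
poller = CategoryPoller(lambda: next(selections), seen.append)
for _ in range(3):
    poller.poll_once()
assert seen == ["Eat", "Gas"]  # the unchanged middle poll fired nothing
```

Keeping the comparison on the HUD side means the service stays a dumb key/value store of the current choice, and any number of displays can poll it independently.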
Windows Phone Application

One of the main ways to control the vehicle is through the Windows Phone application.

The first pivot of the app allows the user to lock, unlock, start the car, and set off the alarm. This is done through the Viper product from Directed Electronics.

The second pivot contains the remaining ways that a user can interact with the car.

Kinect - This uses the Kinect service much in the way the HUD does. It can display both the front and rear cameras, as well as allow the user to record an audio clip and send it up to the car while applying a voice-changing effect.
Voice Effect - When the Talk button is pressed, the user can record their voice via the microphone. When released, the audio data is packaged in a simple WAV file and uploaded to the REST service. The user can select from several voice effects, such as Chipmunk and Deep. On the service side, that WAV file is modified with the selected effect and then played through the PA system. The code in this section of the app is very similar to the Coding4Fun Skype Voice Changer. We use NAudio and several pre-made effects to process the WAV file for playback.
Lighting - This controls the external lighting for the car. The user can select a zone, an animation, and a color to apply. Once selected, this is communicated through the REST service to the lighting controller.
Messaging - This presents a list of known pictures and videos for the user. The selection is sent to the car through the REST service and displayed on the projector that is pointed at the rear window, allowing following drivers to see the image, video, or message.
Point Of Interest - As described earlier, this is the way the user can turn on the Point of Interest map on the HUD. Selecting one of the four items sends the selection to the REST service where it is persisted. The polling HUD will know when the selection has changed and display the map interface as shown above.
Telemetry - This is a replica of the instrument cluster that runs on the Windows 7 slate. OBD data is queried via the WCF service and displayed on the gauges, just like the slate.
Projection Screen - This will raise and lower the projection screen on the rear of the car.
Horn - This displays a list of known horn sound effects that live on the REST service layer. Selecting any of the items sends a command through the REST service to play that sound file on the external sound system of the car. The selected audio file then plays when the horn is pressed in the car.
Settings - Internal settings for setting up the hardware and software of the car.
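A crude version of a chipmunk-style voice effect can be sketched with nothing but a standard library: re-tagging a WAV file with a faster frame rate raises its pitch. This is only an illustration (Python's wave module here); the real app uses NAudio and proper DSP effects:

```python
import wave

def chipmunk(src_path: str, dst_path: str, factor: float = 1.5) -> None:
    """Copy a WAV file, re-tagged with a faster frame rate to raise its pitch."""
    with wave.open(src_path, "rb") as src:
        params = src.getparams()          # namedtuple of channels, width, rate, ...
        frames = src.readframes(params.nframes)
    with wave.open(dst_path, "wb") as dst:
        # Same samples, faster playback: every frequency scales by `factor`.
        dst.setparams(params._replace(framerate=int(params.framerate * factor)))
        dst.writeframes(frames)
```

Playing the same samples back faster shortens the clip and multiplies every frequency by the same factor, which is exactly the classic chipmunk sound; a "Deep" effect is the same trick with a factor below 1.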
Passenger Application

The passenger interface runs on a Samsung Series 7 slate running the Windows 8 Consumer Preview. This interface has a subset of the functionality provided by the Windows Phone application, but communicates through the same REST service. From this interface, the passenger can set the car horn sound effect, view the front and back Kinect cameras, select a Point of Interest category to be displayed on the HUD, and select the image, video or message that will be displayed on the rear window.

External Car Lighting

The external lighting system was controlled by a web server running on a Netduino Plus, using a SparkFun ProtoShield board to simplify wiring and allow another shield to be used. The actual lights were Digital Addressable RGB LEDs w/ PWM. We'll also have a more in-depth article on this system on Coding4Fun shortly.
The car is broken down into different zones: grill, wheels, vents, etc. It also has a number of pre-defined procedural animation patterns with a few adjustable parameters, allowing for things like a snake effect, a sensor sweep, or even a police pattern. Each zone has its own thread, which makes it possible to run multiple animation patterns at the same time. When a command is received, the color, pattern, zone, and other data are then processed.
Here is a basic animation loop pattern.

private static void RandomAnimationWorker()
{
    var leds = GetLedsToIlluminate();
    var dataCopy = _data;
    var r = new Random();

    while (IsThreadSignaledToBeAlive(dataCopy.LightingZone))
    {
        for (var i = 0; i < leds.Length; i++)
            SetLed(leds[i], r.Next(255), r.Next(255), r.Next(255));

        LedRefresh();
        Thread.Sleep(dataCopy.TickDuration);
    }
}

Rear Projection Window

The rear projection system consists of two 4" linear actuators, a linear actuator controller, the NETMF web server from above, a Seeed Studio Relay Shield, the back glass of a 1967 Ford Mustang, some rear projection film, a low-profile yet insanely bright projector that accepts serial port commands, and a standard USB-to-serial adapter.
The REST service layer toggles the projector's input based on the selected state, letting us switch between the HDMI output of an Xbox 360 and the VGA output of the laptop. At the same time, the REST layer sends a command to the NETMF web server to either raise or lower the actuators.
Here is the NETMF code that controls raising and lowering the glass:

public static class Relay
{
    // code for Electronic Brick Relay Shield
    static readonly OutputPort RaisePort = new OutputPort(Pins.GPIO_PIN_D5, false);
    static readonly OutputPort LowerPort = new OutputPort(Pins.GPIO_PIN_D4, false);

    const int OpenCloseDelay = 1000;

    public static bool Raise()
    {
        return ExecuteRelay(RaisePort);
    }

    public static bool Lower()
    {
        return ExecuteRelay(LowerPort);
    }

    private static bool ExecuteRelay(OutputPort port)
    {
        port.Write(true);
        Thread.Sleep(OpenCloseDelay);
        port.Write(false);
        return true;
    }
}

Messaging System

This is a WPF application that leverages the file system on the computer to communicate with the REST service layer. Visually, it shows the message/image/video in the rear window, but it actually performs two other tasks as well: it operates our car horn system and plays the recorded audio from the phone.

Displaying Messages
To display images, we poll the REST service every second for an update. Depending on the return type, we either display a TextBlock element or a MediaElement.

Detecting and Playing Car Horn
When someone presses the horn in the car, it is detected by a Phidget 8/8/8 wired into a digital input. In between the car horn and the Phidget, there is a relay as well. This isolates the voltage coming from the horn and solves a grounding issue. We then feed back two wires from that relay, putting one into the ground and the other into one of the digital inputs. In the application, we listen to the InputChange event on the Phidget and play / stop the audio based on the state.

Detecting new recorded audio from the phone and car horn changes
When someone talks into the phone or selects a new car horn, the REST service layer places that audio file into a predetermined directory. The Messaging service then uses a FileSystemWatcher to detect when the file is added. The difference between car horn detection and recorded audio is that the recorded audio plays only once it is done being written to the file system.
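The "done being written" detail matters because a file watcher typically fires as soon as the file is created, often before the sender has finished writing it. One common workaround, sketched here in Python (the article does not show its C# implementation, so this is an illustrative assumption), is to wait until the file size stops changing:

```python
import os
import time

def wait_until_stable(path: str, checks: int = 3, interval: float = 0.1) -> bool:
    """Return True once the file size is unchanged for `checks` consecutive reads."""
    last_size, stable = -1, 0
    while stable < checks:
        try:
            size = os.path.getsize(path)
        except OSError:
            return False  # file vanished or never appeared
        if size == last_size:
            stable += 1
        else:
            last_size, stable = size, 0
        time.sleep(interval)
    return True
```

Only after this returns True would the player be handed the path, avoiding a partially written WAV being played.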
External PA System

To interact with people, we installed an external PA (Public Address) system. This system is hooked into the laptop that is connected to the car horn, and can play audio data from the phone. Having a PA system as simple as an audio jack that plugs into a PC enabled us to have different ringtones for the car horn and to talk through the car using a Windows Phone.

After months of planning and building, the Project Detroit car was shown to the world on an episode of Inside West Coast Customs. Though it was a ton of work, the end product is something we are all proud of. We hope that this project inspires other developers to think outside the box and realize what can be done with some off-the-shelf hardware, software, and passion.



In early January, we were tasked with creating a unique, interactive experience for the SXSW Interactive launch party with Frog Design. We bounced around many ideas, and finally settled on a project that Rick suggested during our first meeting: boxing robots controlled via Kinect.
The theme of the opening party was Retro Gaming, so we figured creating a life size version of a classic tabletop boxing game mashed up with a "Real Steel"-inspired Kinect experience would be a perfect fit. Most importantly, since this was going to be the first big project of the new Coding4Fun team, we wanted to push ourselves to create an experience that needed each of us to bring our unique blend of hardware, software, and interaction magic to the table under an aggressively tight deadline.

The BoxingBots had to fit a few requirements:

They had to be fun
They had to survive for 4 hours, the length of the SXSW opening party
Each robot had to punch for 90 seconds at a time, the length of a round
They had to be life-size
They had to be Kinect-drivable
They had to be built, shipped, and reassembled for SXSW
Creating a robot that could be beaten up for 4 hours and still work proved to be an interesting problem. After doing some research on different configurations and styles, it was decided we should leverage a prior project to get a jump start to meet the deadline. We repurposed sections of our Kinect drivable lounge chair, Jellybean! This was an advantage because it contained many known items, such as the motors, motor controllers, and chassis material. Additionally, it was strong and fast, it was modular, and the code to drive it was already written.
Jellybean would only get us part of the way there, however. We also had to do some retrofitting to get it to work for our new project. The footprint of the base needed to shrink from 32x50 inches to 32x35 inches, while still allowing space to contain all of the original batteries, wheels, motors, motor controllers, switches, and voltage adapters. We also had to change how the motors were mounted with this new layout, as well as provide a way to easily "hot swap" the batteries during the event. Finally, we had to mount an upper body section that looked somewhat human, complete with a head and punching arms.

Experimenting with possible layouts
The upper body had its own challenges, as it had to support a ton of equipment, including:

Punching arms
Popping head
Pneumatic valves
Air manifold
Air tank(s)
Laptop
Phidget interface board
Phidget relay boards
Phidget LED board
Xbox wireless controller PC transmitter / receiver
Chest plate
LEDs
Sensors to detect a punch

Brian and Rick put together one of the upper frames
Punching and Air Tanks

We had to solve the problem of getting each robot to punch hard enough to register a hit on the opponent bot without breaking the opponent bot (or itself). Bots also had to withstand a bit of side load in case the arms got tangled or took a side blow. Pneumatic actuators gave us a lot of flexibility over hydraulics or an electrical solution, since they are fast, come in tons of variations, won't break when met with resistance, and can be fine-tuned with a few onsite adjustments.
To provide power to the actuators, the robots had two 2.5 gallon tanks pressurized to 150psi, with the actuators punching at ~70psi. We could punch for about five 90-second rounds before needing to re-pressurize the tanks. Pressurizing the onboard tanks was taken care of by a pair of off-the-shelf DeWalt air compressors.

The Head

It wouldn't be a polished game if the head didn't pop up on the losing bot, so we added another pneumatic actuator to raise and lower the head, and some extra red and blue LEDs. This pneumatic is housed in the chest of the robot and is triggered only when the game has ended.
To create the head, we first prototyped a concept with cardboard and duct tape. A rotated welding mask just happened to provide the shape we were going for on the crown, and we crafted each custom jaw with a laser cutter. We considered using a mold and vacuum forming to create something a bit more custom, but had to scrap the idea due to time constraints.


Our initial implementation for detecting punches failed due to far too many false positives. We thought using IR distance sensors would be a good solution, since we could detect a close punch and tell the other robot to retract the arm before real contact. Testing looked promising, but in practice, when the opposing sensors were close, we saw a lot of noise in the data. The backup, and currently implemented, solution was to install simple push switches in the chest and detect when they are clicked by the chest plate pressing against them.


Different items required different voltages: the motors and pneumatic valves required 24V, the LEDs 12V, and the USB hub 5V. We used Castle Pro BEC converters to step down the voltages. These devices are typically used in RC airplanes and helicopters.

So how does someone ship two 700lb robots from Seattle to Austin? We did it in 8 crates. The key thing to note is that the tops and bottoms of each robot were separated. Any wire connecting the two parts had to be disconnectable in some form. This affected the serial cords and the power cords (5V, 12V, 24V).

The software and architecture went through a variety of iterations during development. The final architecture used 3 laptops, 2 desktops, an access point, and a router. It's important to note that the laptops for Robot 1 and Robot 2 are physically mounted on the backs of the robot bodies, communicating through WiFi to the Admin console. The entire setup looks like the following diagram:

Admin Console

The heart of the infrastructure is the Admin Console. Originally, this was also intended to be a scoreboard to show audience members the current stats of the match, but as we got further into the project, we realized this wouldn't be necessary. The robots are where the action is, and people's eyes focus there. Additionally, the robots themselves display their current health status via LEDs, so duplicating this information isn't useful. However, the admin side of this app remains.

The admin console is the master controller for the game state and uses socket communication between itself, the robots, and the user consoles. A generic socket handler was written to span each computer in the setup. The SocketListener object allows incoming connections to be received, while the SocketClient allows clients to connect to those SocketListeners. These are generic classes whose type parameters, constrained to GamePacket, specify what to send and receive:

public class SocketListener<TSend, TReceive>
    where TSend : GamePacket
    where TReceive : GamePacket, new()

GamePacket is a base class from which specific packets inherit:

public abstract class GamePacket
{
    public byte[] ToByteArray()
    {
        MemoryStream ms = new MemoryStream();
        BinaryWriter bw = new BinaryWriter(ms);

        try
        {
            WritePacket(bw);
        }
        catch (IOException ex)
        {
            Debug.WriteLine("Error writing packet: " + ex);
        }

        return ms.ToArray();
    }

    public void FromBinaryReader(BinaryReader br)
    {
        try
        {
            ReadPacket(br);
        }
        catch (IOException ex)
        {
            Debug.WriteLine("Error reading packet: " + ex);
        }
    }

    public abstract void WritePacket(BinaryWriter bw);
    public abstract void ReadPacket(BinaryReader br);
}

For example, in communication between the robots and the admin console, GameStatePacket and MovementDescriptorPacket are sent and received. Each GamePacket must implement its own ReadPacket and WritePacket methods to serialize itself for sending across the socket.
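The read/write contract boils down to serializing fields in a fixed order and deserializing them in the same order. A concrete packet in this style can be sketched in a few lines; the field layout here (two health values and a round number) is a hypothetical example, with Python's struct module standing in for BinaryWriter/BinaryReader:

```python
import struct

# Little-endian layout of three Int32s: health1, health2, round number.
FORMAT = "<iii"

def write_packet(health1: int, health2: int, round_no: int) -> bytes:
    """Serialize the fields in a fixed order, as WritePacket would."""
    return struct.pack(FORMAT, health1, health2, round_no)

def read_packet(data: bytes) -> tuple:
    """Deserialize in the same field order, as ReadPacket would."""
    return struct.unpack(FORMAT, data)

# Round-trip: what goes across the socket comes back intact.
packet = write_packet(100, 85, 2)
assert len(packet) == 12
assert read_packet(packet) == (100, 85, 2)
```

Keeping both sides of the wire format in one place (a single packet class, or a single format string) is what makes this kind of hand-rolled binary protocol manageable.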
Packets are sent between machines every "frame". We need the absolute latest game state, robot movement, etc. at all times to ensure the game is functional and responsive.

As is quite obvious, absolutely no effort was put into making the console "pretty". It is never seen by the end users and just needs to be functional. Once the robot software and the user consoles are started, the admin console initiates connections to each of those four machines. Each machine runs the SocketListener side of the socket code, while the admin console creates four SocketClient objects to connect to each of those. Once connected, the admin has control of the game and can start, stop, pause, and reset a match by sending the appropriate packets to everyone connected.

The robot UI is also never intended to be seen by an end user, and therefore contains only diagnostic information.

Each robot has a wireless Xbox 360 controller connected to it so it can be manually controlled. The UI above reflects the positions of the controller sticks and buttons. During a match, it's possible for a bot to get outside of our "safe zone". One bot might be pushing the other, or the user may be moving the bot toward the edge of the ring. To counter this, the player's coach can either temporarily move the bot, turning off Kinect input, or force the game into "referee mode" which pauses the entire match and turns off Kinect control on both sides. In either case, the robots can be driven with the controllers and reset to safe positions. Once both coaches signal that the robots are reset, the admin can then resume the match.
Controlling Hardware

Phidget hardware controlled our LEDs, relays, and sensors. Getting data out of a Phidget, along with performing actions such as opening and closing a relay, is shockingly easy, as they have pretty straightforward C# APIs and samples, which is why they are typically our go-to product for projects like this.
Below are some code snippets for the LEDs, relays, and sensor.
LEDs from LedController.cs
This is the code that actually updates the health LEDs in the robot's chest. The LEDs were put on the board in a certain order to allow this style of iteration. We had a small issue of running out of one color of LED, so we used some super-bright ones and had to reduce the power levels of the non-super-bright LEDs to prevent possible damage:

private void UpdateLedsNonSuperBright(int amount, int offset, int brightness)
{
    for (var i = offset; i < amount + offset; i++)
    {
        _phidgetLed.leds[i] = brightness / 2;
    }
}

private void UpdateLedsSuperBright(int amount, int offset, int brightness)
{
    for (var i = offset; i < amount + offset; i++)
    {
        _phidgetLed.leds[i] = brightness;
    }
}
Sensor data from SensorController.cs
This code snippet shows how we obtain the digital and analog inputs from the Phidget 8/8/8 interface board:

public SensorController(InterfaceKit phidgetInterfaceKit) : base(phidgetInterfaceKit)
{
    PhidgetInterfaceKit.ratiometric = true;
}

public int PollAnalogInput(int index)
{
    return PhidgetInterfaceKit.sensors[index].Value;
}

public bool PollDigitalInput(int index)
{
    return PhidgetInterfaceKit.inputs[index];
}
Relays from RelayController.cs
Electrical relays fire our pneumatic valves, which control the head popping and the arms punching. For our application, we wanted the ability to reset the relay automatically. When the relay is opened, an event is triggered and we create an actively polled thread to check whether we should close the relay. The reason we actively poll is that someone could be rapidly toggling the relay, and we wouldn't want to close it by accident. The polling logic can result in a slightly delayed or early trigger for closing the relay, but for the BoxingBots a difference of 10ms in relay closing is acceptable:

public void Open(int index, int autoCloseDelay)
{
    UseRelay(index, true, autoCloseDelay);
}

public void Close(int index)
{
    UseRelay(index, false, 0);
}

private void UseRelay(int index, bool openRelay, int autoCloseDelay)
{
    AlterTimeDelay(index, autoCloseDelay);
    PhidgetInterfaceKit.outputs[index] = openRelay;
}

void _relayController_OutputChange(object sender, OutputChangeEventArgs e)
{
    // closed
    if (!e.Value)
        return;

    ThreadPool.QueueUserWorkItem(state =>
    {
        if (_timeDelays.ContainsKey(e.Index))
        {
            while (_timeDelays[e.Index] > 0)
            {
                Thread.Sleep(ThreadTick);
                _timeDelays[e.Index] -= ThreadTick;
            }
        }

        Close(e.Index);
    });
}

public int GetTimeDelay(int index)
{
    if (!_timeDelays.ContainsKey(index))
        return 0;

    return _timeDelays[index];
}

public void AlterTimeDelay(int index, int autoCloseDelay)
{
    _timeDelays[index] = autoCloseDelay;
}
User Console

Since the theme of the party was Retro Gaming, we wanted to go for an early-80's sci-fi style interface, complete with starscape background and solar flares! We also wanted to create actual interactive elements while maintaining the green phosphor look of early monochrome monitors. Unlike traditional video games, however, the screens are designed not as the primary focus of attention, but rather to help calibrate the player before the round and provide secondary display data during the match. The player should stay focused on the boxer during the match, so the interface is designed to sit under the player's view line and serve as more of a dashboard during each match.
However, during calibration before each round, it is important for the player to understand how their core body will be used to drive the robot base during the round. To do this, we needed to track an average of the joints that make up each fighter's body core. We handled this by creating a list of core joints and a variable that normalizes the metric distances returned by the Kinect sensor into a human-acceptable range of motion:

private static List<JointType> CoreJoints = new List<JointType>(new JointType[]
{
    JointType.AnkleLeft,
    JointType.AnkleRight,
    JointType.ShoulderCenter,
    JointType.HipCenter
});

private const double RangeNormalizer = .22;
private const double NoiseClip = .05;

And then during each skeleton calculation called by the game loop, we average the core positions to determine where the players are relative to their playable ring boundary:

public static MovementDescriptorPacket AnalyzeSkeleton(Skeleton skeleton)
{
    // ...
    CoreAverageDelta.X = 0.0;
    CoreAverageDelta.Z = 0.0;

    foreach (JointType jt in CoreJoints)
    {
        CoreAverageDelta.X += skeleton.Joints[jt].Position.X - RingCenter.X;
        CoreAverageDelta.Z += skeleton.Joints[jt].Position.Z - RingCenter.Z;
    }

    CoreAverageDelta.X /= CoreJoints.Count * RangeNormalizer;
    CoreAverageDelta.Z /= CoreJoints.Count * RangeNormalizer;

    // ...
    if (CoreAverageDelta.Z > NoiseClip || CoreAverageDelta.Z < -NoiseClip)
    {
        packet.Move = -CoreAverageDelta.Z;
    }

    if (CoreAverageDelta.X > NoiseClip || CoreAverageDelta.X < -NoiseClip)
    {
        packet.Strafe = CoreAverageDelta.X;
    }
}

In this way, we filter out insignificant data noise and let the player's average core position serve as a joystick for driving the robot around. The player can lean at any angle; the move and strafe values are set accordingly to allow a full 360 degrees of movement freedom, while not allowing any one joint to unevenly influence the direction of motion.
Another snippet of code that may be of interest is the WPF3D rendering we used to visualize the skeleton. Since the Kinect returns joint data based off of a center point, it is relatively easy to wire up a working 3D model in WPF3D off of the skeleton data, and we do this in the ringAvatar.xaml control.
In the XAML, we simply need a basic Viewport3D with a camera, lights, and an empty ModelVisual3D container to hold our squares. The empty container looks like this:

In the code, we created a generic WPF3DModel that inherits from UIElement3D and is used to store the basic positioning properties of each square. In the constructor of the object, though, we can pass a reference key to a XAML file that defines the 3D mesh to use:

public WPF3DModel(string resourceKey)
{
    this.Visual3DModel = Application.Current.Resources[resourceKey] as Model3DGroup;
}

This is a handy trick when you need to do a fast WPF3D demo and require a certain level of flexibility. To create a 3D cube for each joint when ringAvatar is initialized, we simply do this:

private readonly List<WPF3DModel> _models = new List<WPF3DModel>();

private void CreateViewportModels()
{
    for (int i = 0; i < 20; i++)
    {
        WPF3DModel model = new WPF3DModel("mesh_cube");
        viewportModelsContainer2.Children.Add(model);
        // ...
        _models.Add(model);
    }
    // ...
}

And then each time we need to redraw the skeleton, we loop through the skeleton data and set the cube position like so:

if (SkeletonProcessor.RawSkeleton.TrackingState == SkeletonTrackingState.Tracked)
{
    int i = 0;

    foreach (Joint joint in SkeletonProcessor.RawSkeleton.Joints)
    {
        if (joint.TrackingState == JointTrackingState.Tracked)
        {
            _models[i].Translate(
                joint.Position.X * 8.0,
                joint.Position.Y * 10.0,
                joint.Position.Z * -10.0);
            i++;
        }
    }
    // ...
}

There are a few other areas in the User Console that you may want to dig into further, including the weighting for handling a punch as well as dynamically generating arcs based on the position of the fist relative to the shoulder. However, for this experience, the User Console serves as a secondary display to support the playing experience and gives both the player and audience a visual anchor for the game.
Making a 700lb Tank Drive like a First Person Shooter

The character in a first person shooter (FPS) video game has an X position, a Y position, and a rotation vector. On an Xbox controller, the left stick controls the X,Y position. Y is the throttle (forward and backward), X is the strafing amount (left and right), and the right thumb stick moves the camera to change what you're looking at (rotation). When all three are combined, the character can do things such as run around someone while looking at them.
In the prior project, we had existing code for controlling all 4 motors at the same time, working much like a tank does, so we only had throttle (forward and back) and strafing (left and right). Accordingly, we could move the motors in all directions, but there were still scenarios in which the wheels fight one another and the base won't move. By moving to an FPS style, we eliminate the ability to move the wheels in a non-productive way and actually make it a lot easier to drive.
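The core of this kind of input mixing is the classic arcade-drive formulation, which can be sketched in a few lines (a simplified take; the article's version also folds in a strafe term, deadzone handling, and wiring-polarity fixes, and the [-1, 1] clamp here is an added assumption about the motor controller's range):

```python
def arcade_to_tank(throttle: float, rotation: float):
    """Map FPS-style throttle and rotation onto left/right tank drive values.

    Classic arcade drive: rotation is added to one side and subtracted
    from the other, so pure rotation spins in place.
    """
    clamp = lambda v: max(-1.0, min(1.0, v))
    left = clamp(throttle + rotation)
    right = clamp(throttle - rotation)
    return left, right

# Full forward, no rotation: both sides drive equally.
assert arcade_to_tank(1.0, 0.0) == (1.0, 1.0)
# Pure rotation: sides oppose, spinning the base in place.
assert arcade_to_tank(0.0, 0.5) == (0.5, -0.5)
```

Because the two sides can never be commanded into a stalemate, the wheels stop fighting one another, which is exactly the benefit described above.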
Note that Clint had some wiring "quirks" with polarity and with which motor was left vs. right; he had to correct these quirks in software:

public Speed CalculateSpeed(double throttleVector, double strafeVector, double rotationAngle)
{
    rotationAngle = VerifyLegalValues(rotationAngle);
    rotationAngle = AdjustValueForDeadzone(rotationAngle, AllowedRotationAngle, _negatedAllowedRotationAngle);

    // flipped wiring, easy fix is here
    throttleVector *= -1;
    rotationAngle *= -1;

    // miswired, had to flip throttle and strafe for calc
    return CalculateSpeed(
        strafeVector + rotationAngle, throttleVector,
        strafeVector - rotationAngle, throttleVector);
}

protected Speed CalculateSpeed(double leftSideThrottle, double leftSideVectorMultiplier, double rightSideThrottle, double rightSideVectorMultiplier)
{
    /* code from Jellybean */
}

Conclusion

The Boxing Bots project was one of the biggest things we have built to date. It was also one of our most successful projects. Though it was a rainy, cold day and night in Austin when the bots were revealed, and we had to move locations several times during setup to ensure the bots and computers wouldn't be fried by the rain, they ran flawlessly for the entire event and contestants seemed to have a lot of fun driving them.

