This weekend I went to the Tata Communications Future of Collaboration hackathon at nestGSV in Redwood City. With a category as broad as making collaboration easier, it was hard to narrow down what our team wanted to build in less than 18 hours.
While enjoying a delicious dinner from SF Whisk, our team started brainstorming.
What if you had access to a remote team of experts you could ask questions, or form longer-term relationships with for services like legal and regulatory tasks?
AT&T held another hackathon this weekend at the AT&T Foundry in Palo Alto. This one was focused on public safety and making dangerous situations safer using technology.
On Friday night, a number of law enforcement professionals from across the country came to talk about what life is like being an officer and protecting our communities. It was a gold mine of ideas that should have motivated everyone to step it up to the next level.
As I blogged about on Friday afternoon, there are so many things that can go wrong when responding to an emergency. I didn’t even touch on what these guys said. By the end of the evening I can honestly say I felt defeated. With all my skills and motivation, public safety is a really hard problem to solve. Anything I could create in 24 hours would barely touch what we as a country and the world need to solve. It’s a never-ending battle. With this hackathon, though, we can start small and make a difference.
One thing that really got to me was the example of an officer making a traffic stop who has to keep a visual on the suspect. One second looking down at a computer screen is enough time for a suspect to fire a bullet at the officer. Wow, when you put it that way, we have to change the way an officer gets information. A heads-up display could provide the officer valuable information. Is that why officers and dispatchers still communicate license numbers over the air?
Another example was license plate recognition software. If a vehicle is fleeing the scene of a crime, the plate can be put into a system that connects to other cities automatically. Other patrol cars are constantly scanning every license plate they can see as they drive down the street. It just so happens a patrol car in another city captures a plate that matches the fleeing vehicle. The officer is alerted that this vehicle is connected to a crime and can proceed to make a traffic stop. You have taken a moving needle in an ever-changing haystack and made locating it simple and automatic. That was impressive.
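At its core, the hotlist flow described above can be sketched in a few lines. This is only an illustration under my own assumptions — the plate numbers, names, and normalization are invented, and a real ALPR system is far more involved:

```javascript
// Sketch: a shared set of wanted plates ("hotlist") that every patrol car's
// camera reads are checked against. Plates here are made up.
const hotlist = new Set(['7ABC123', '4XYZ889']);

// Normalize whatever the camera OCR produced (case, dashes, spaces),
// then check membership in the hotlist.
function checkPlate(raw) {
  const plate = raw.toUpperCase().replace(/[^A-Z0-9]/g, '');
  return { plate, hit: hotlist.has(plate) };
}

console.log(checkPlate('7abc-123')); // → { plate: '7ABC123', hit: true }
console.log(checkPlate('5QRS000')); // → { plate: '5QRS000', hit: false }
```

The interesting part in the real system isn't the lookup — it's that the hotlist is shared across cities and the check happens automatically on every plate the cameras see.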
With many ideas and so few hours to code, it was time to get down to business.
I’m sitting here with so many ideas spinning around in my head for the AT&T Public Safety hackathon starting tonight. And it’s hitting me harder than I expected. My heroes need my help.
My heroes, those men and women who respond to emergencies and run toward what everyone else runs away from, need my help. They are at the mercy of whatever they are exposed to: explosions, toxic chemicals, injuries, unforeseen situations that can take their lives in a split second.
Sure, they are trained for common situations and don’t blindly go into unsafe situations. But there are people in this world who take pleasure in seeing these men and women, my heroes, for lack of a humane word, die. And these men and women, MY heroes, who protect ME, MY family, MY neighbors and MY city don’t deserve that. They have families, husbands, wives, boyfriends, girlfriends, sons, daughters, grandchildren, and pets waiting for their hugs and kisses when they go home.
If you’re not infuriated by now like I am, you should be.
I grew up playing with firetrucks, fighting imaginary fires with my Lincoln Log houses, punching holes in the roof like I watched my heroes do in real life. Looking back, I guess I assumed the families and pets were safely out of the house by the time my toy firetruck arrived on the scene. My toy houses would be left in disrepair, but the fire was out. Then I would proceed with my next childhood interest, construction, and repair the houses so the families could move back in. Yes, I was a creative child.
I could have been a firefighter if the Internet didn’t distract me in my teenage years. Since I was a young child, I have always looked out the window for or at a passing emergency vehicle. And my adult heart always sinks when I hear the sirens. Not just for the person(s) they are responding to, but for my heroes in that vehicle.
They have NO IDEA what they’re racing towards.
With 9/11, there are no words to describe what happened before, during, and after. A lack of communication channels to efficiently manage real-time information, a lack of situational awareness for everyone involved, and a lack of advanced techniques for solving fluid, dynamic problems (but really, how much can you prepare for in that type of situation?) were some of the reasons my heroes lost their lives.
More recently, the San Bruno gas line explosion, the Boston Marathon explosions, and the fertilizer plant explosion in Texas have posed similar problems for my heroes. They are human and can only do so much. But technology can help. All those pictures and videos taken in Boston helped speed the investigation. The people-finder/I'm-safe websites that pop up help locate people after the event. The I-have-a-sofa websites that pop up to offer a displaced and traumatized person a place to sleep for the night make these tragic events a little more humane.
Coming back to the hackathon tonight and why this is affecting me: I have the power to help my heroes use technology to do an even better job. Let me clarify, they already do a damn good job. I have no complaints for how my heroes do their job. I'm actually speechless as to how my heroes do such a damn good job. They just do.
I have complaints for what “others” do to my heroes.
If I can help just one hero make it out of a disaster safely, they could be the one carrying ME, MY loved one, or MY neighbor (who I may not even talk with) out with them. This world works in mysterious ways, ways that boggle my mind. And I have the ability to help them. I work with technology that can make my hero’s life easier in these stressful situations.
So what am I planning to work on this weekend? I want to work on all three problems I mentioned above. But that's a lot to tackle in 24 hours. We need to start somewhere and keep at this until… well, never stop. Yes, we can never prevent bad things and bad days from happening. But we can help my heroes when they do. We don't need any more really bad days.
Innospring hosted a hackathon with Evernote this past weekend, held at their office on Tasman Drive in Santa Clara and simultaneously in Shanghai.
Ty from Evernote introduced their API and a suite of apps in their Trunk (aka their store). I had heard of them several years ago but was never into storing my data in the cloud at that time. Oh how times have changed.
Always on the lookout for new APIs to integrate into my existing projects, especially Read With Me App, I was interested in playing around with Evernote. The answers to some questions I had about reading text from images in Evernote didn't match my expectations, so I had to scrap my first idea. From my understanding, the API indicates whether a search query's words are found in an image and where those words appear within it, but it doesn't return the text as a whole like an OCR engine does. I want to research whether there's another way or whether there's a hack I can write.
My idea for this weekend started with combining Evernotes and your location. My more ambitious vision was walking into a room and seeing Evernotes “posted” on the walls using augmented reality and Google Glass. Not something that can be done in 40 hours (and with sleep!).
I simplified my vision to buckets at specific locations. A location could have multiple buckets of varying radii. If you enter a bucket's radius, you get an alert (a timeline card in Glass) that there are Evernotes available. Step out of that circle, and you don't see them anymore.
This was relatively simple to code. I created a mobile screen to attach Evernotes to a bucket. I created a couple of buckets and attached Evernotes into them.
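The core of the bucket check is just a distance test. Here is a minimal sketch of that idea under my own assumptions — the bucket names, fields, and coordinates are invented, not the hackathon code — using the haversine formula for great-circle distance:

```javascript
// Great-circle distance between two lat/lon points, in meters.
function haversineMeters(lat1, lon1, lat2, lon2) {
  const R = 6371000; // mean Earth radius in meters
  const toRad = (d) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Return the buckets whose radius covers the user's current position.
function bucketsInRange(lat, lon, buckets) {
  return buckets.filter(
    (b) => haversineMeters(lat, lon, b.lat, b.lon) <= b.radiusMeters
  );
}

// Example: two hypothetical buckets, user standing near the first one.
const buckets = [
  { name: 'Lobby notes', lat: 37.3894, lon: -121.9753, radiusMeters: 100 },
  { name: 'Conference room', lat: 37.4000, lon: -121.9900, radiusMeters: 50 },
];
const visible = bucketsInRange(37.3895, -121.9754, buckets);
console.log(visible.map((b) => b.name)); // → [ 'Lobby notes' ]
```

As the user moves, re-running the check against their latest location is what decides which timeline cards to show or hide.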
But the thing I really wanted to include was Google Glass. Having seen Glass a couple of times this past week, it's starting to grow on me. If I could go about my day and get alerts the way my cellphone does with email, but through a hands-free visual interface like Glass, where I don't have to dedicate one or both hands to holding a phone, it could really change how easily I get immediate information based on my location.
The UI was simple for Glass. The name of the bucket, the names of a couple of Evernotes, and maybe the event info. As I move around, Glass will use my location to look for buckets (or hotspots) of Evernotes near me and trigger timeline cards to be displayed.
On Saturday, I went over to the Microsoft campus to learn about Node.js, a lightweight JavaScript runtime for serving requests on a web server.
One of the features of Node.js is the ease of using sockets. Because sockets leave the connection open between the client (browser) and the server, sending little bits of data back and forth is quick and requires fewer resources than opening a new connection for each request.
After a morning of a quick introduction to Node.js, we had an opportunity to get our fingers coding.