Recent News

My second story, Borbaton, is now up. Set in the near future, it follows a man sharing his first experiences with semi-autonomous drone technology that aids in bird identification.




Sunday, July 13, 2025

Borbaton

It was a Kickstarter post on my Facebook feed where I first saw the clownish product name Borbaton. So many random products had hit our feeds over the years that I scarcely noticed it amongst the 80% ads that now populated my daily view on the platform.

 


It looked like a floating droid ball from a science-fiction movie, the kind that might move about and shoot lasers at enemies. AI-driven content had long since made it nearly impossible to tell reality from fantasy. Still, I lingered on the post, curious at the least what the name even meant.


Bird + Orb + Automaton = Borbaton


The drone-like device promised an impressive array of sensors and capabilities meant to aid the owner in finding and identifying more species of birds while in the field. The ad even claimed the product was live and that the project only needed start-up capital to produce the first lot for field testing by backers. Typical language like “spots limited,” “the future is now,” and “birding will never be the same again” was plastered across a video sequence showing a user controlling the Borbaton setup via smartphone and voice command.


It was mesmerizing as one user drove up to a wetland, simply held the device out the window, and commanded it to scan the entire wetland area for uncommon to rare species. It lifted off from their palm in silence and jetted up to about 100 feet in the air. Overlaid text and audio indicated the device would create a map of the space to cover and then begin using visual, audio, and thermal scanning to map prospective species. Listed tools included extensive machine learning models for visual and audio-based identification, with results reported back to the user's phone app. Device-based advances included nearly silent propulsion and hovering modes allowing near-zero disturbance to birds.


The video shifted to the still-seated driver reviewing a phone app that was beginning to show muted listings of common birds in the marsh. The Borbaton hadn't yet dropped altitude to start scouring the area when a yellow-flagged species popped up on the screen, indicating a 75% confidence rating in the presence of a Common Gallinule, with audio being the initial source for identification. The camera shifted view to the device, now descending and silently moving towards the back corner of the marsh. A few seconds later it sat frozen in the air as a chime could be heard from the phone. Panning back to the phone screen showed a video icon next to the flagged record, now showing 100% certainty with audio and video proof noted.


The user tapped the screen and up popped a live video feed showing a once-hidden Common Gallinule slowly paddling along a line of cattails. The audio feed filled the vehicle cabin with calling Red-winged Blackbirds and the soft grunts of the Gallinule, which was much too far away to be heard by ear without the aid of this particular device.


The driver spoke again, asking the device to continue scanning the area. The advertisement wrapped up with a splash screen indicating 500 spots available for the pilot program and a hefty $8,000 price tag. It would be a $4,000,000 startup, with promises of advanced software updates for early adopters to bring on further capabilities. A short list included full autonomous mode, deep scanning, nocturnal analysis, and something referred to as hive mind.


I pressed the replay option and watched it all again, fascinated by what might be if this were not just a cash-grab scam targeting the birding community. A day later I still hadn't made up my mind, but I returned to the bookmarked link for the Kickstarter, which now indicated fewer than 50 spots remaining and an estimate of only a few hours before it would likely close. I had done some research and found that the project team comprised a couple of Cornell alums and an MIT friend who had developed the engineering around the drive mechanism and silent operation features. Beyond the fact that it was almost too good to be true, it really did appear to be an honest attempt at bringing autonomous drone technology to bird watching. It would be the next step beyond having listening devices that pull in lists of birds all day and night from a stationary position.


Little did I realize that upon taking this $8K leap of faith I would enter a world of bird watching I never could have imagined. My own field mobility had been waning for several years, though I still greatly enjoyed getting to the more remote locations. Immersing myself in the sounds of the wild was a joy that helped reduce the isolation and loneliness that permeated most of my days now. My hearing had dulled considerably on one side, and back issues had finally found me after decades of long hikes. I figured I was in better shape than many my age, but each year a little bit more was being taken away. I longed to somehow stave off time itself for even just a few more years before I might be rendered mostly sedentary. I knew the bell would toll for me one day, and all I sought was to continue my long friendship with the birds in any way that I could. In that light the money would either buy some small measure of what I sought or be a lottery ticket dream that simply didn't win the day. I was fine with either outcome honestly, but I wasn't prepared for what came next.


- - -


Weeks stretched into months and, despite consistent social media posts, I began to lose faith a product would arrive. My assumption was that I had in fact purchased vaporware and nothing more. In the meantime I had begun reviewing the source code that had been posted to an online repo for the project. It appeared the developers had for some reason decided to open-source their project code, perhaps with the intent that it would be improved by the user community during testing. That seemed a stretch though, considering how few birders were actually programming experts. The documentation page gave even fewer clues, as it could scarcely even be understood that the code was intended to run on the Borbaton platform at all. I had found the repository only after a cryptic social media message led me to an unrelated user's profile with a truncated URL link. I cloned the code and moved a copy to my cold-storage drive, assuming it would be a way to recover any future device should the local storage fail, though in all honesty I never really thought a device would materialize.


Sometime in the deep of winter word went out that devices would ship in the coming weeks once they cleared approval from the government. During production, new regulations had gone into effect that instituted required testing and licensing protocols for any type of autonomous drone technology fitted with AI pilot routines. Though it was boring reading, the regulations implied that drone technology would not be allowed to run on a software platform fully controlled by artificial intelligence. All of it made me wonder if the end result would be a hamstrung device capable of much, but not all, that was promised. I would of course take what I could get if it meant a flight-capable device with near-silent operation that could still find birds and identify them using existing machine learning models for audio, video, and images.


As the wait stretched into additional months I recognized that the pilot core in the codebase I had cloned from the now-defunct repository actually appeared to be based upon a modified autonomous vehicle. I quickly concluded that the codebase that had been shared was in fact the original intended code for the platform and that I had likely pulled one of a very limited number of copies; when I performed my pull the website had noted just 6 users total. What all of this meant wasn't clear beyond the reality that the repository was seemingly gone, along with the user account that had posted the original link to it. The developers' social media account had been scrubbed of the reference to the now-missing user account, and in its place was a post indicating a direct link to the open-source code was available.


The two codebases were light-years apart from one another, and the new one presented what I had originally thought we'd see: a hamstrung device platform with limited AI and simple search mechanisms that would likely run scripted methods and routes over a defined space. From what I could tell it was still pretty impressive, but nothing compared to the advertisement seen on Facebook. Oddly, even that video and ad were now missing from every location I searched. It was as if the origins of the project had been scrubbed completely from the web during the production approval cycle.


Then, just like that, a large box arrived via signature delivery in the early spring. It seemed comically large for a device that would fit into the palm of my hand. I had to open it on the front walkway, remove the contents, and break down the box to get it in the door. There it sat on my coffee table in a black three-point stand, held like a crystal ball and covered in a fine coated mesh. When I placed it down, the weight of it felt substantial for something that was to hover and fly as if defying gravity itself. For the first time in ages I felt the prickles of anticipation and excitement a child might feel over a new toy on the morning of a birthday. The one-sheet color guide was really just a QR code for opening the real instruction manual on their website, which I had already covered a few times over, so I ignored it outright.


I had already installed the core interfacing application on my phone several weeks prior, and at this point I was ready to go once device pairing was completed. Proximity alone was enough for that, and lights began to flicker along with a quick hover check as it gently and nearly silently rose off the stand. After a few rotations the Borb settled back to the base, ran another series of checks, and popped validation content streams up on my phone for audio, video, and thermal imaging. Device audio asked me to stand up and move about for follow mode. I stood up slowly, feeling my age in the process, and shuffled across the room. As I grasped for a handhold a few times, the Borb moved towards the ceiling and slowly trailed my position while streaming video of the entire sequence of my movement around the open living space of my home.


I wrapped up in-home testing with my mind spinning from what I was experiencing, even without birds being involved. I realized that upon finding a rare bird, a Borb could be sent out to monitor its location as long as it had charge. The ramifications for chase birding were pretty wild, with the potential for real-time updates to be streamed to other reporting platforms while the finder sat comfortably in the car or went bird watching in other spaces beyond the target bird. I imagined even protected species could literally have a Borb keeping an eye on them and holding aggressive birders or photographers at bay when necessary. I needed to begin fieldwork immediately.

 

- - -


Just a few feet from the parking lot was the first wetland and scrubby wooded space where I wanted to try out the Borb. I'd known this space and the park itself for some 35 years now. It typically had nothing of great interest, but it worked as a control because I knew the space pretty well. Trails ringed the edge of the woodland buffer, with a few incursion points allowing viewing of the few acres of water, last year's cattails, and several downed trees along the edges.


My plan for this test was simple. Send the device out while keeping my own list from my stationary position with binoculars and ear birding only. Let the Borb stream a species list back to my cell phone and compare the two to see how well it did against my own efforts.


I held the Borb out with my right hand and it lightly lifted off, holding position in front of me waiting for commands.


Me: “Scan the wetland in this area stopping at the wooded ring, parking lot, and campsites along the far shore for all bird species possible.” 


Borb: “Command understood, beginning scan mapping.”


The Borb shot up to the expected 100 feet of elevation or so and hovered for about a minute as it presumably scanned the area and generated a map of the territory to be covered. The AI routines, I had read, were capable of parsing language pretty well and, using machine learning models, could photograph the area and figure out what I meant by wooded ring, parking lot, and campsites as constraining parameters.


A chime hit my cell phone and I was shown a map with live images of the area that had been stitched together by the Borb. The map was overlaid with a grid indicating what it had interpreted as the borders of the zone to be scanned, along with an additional grid showing the route it would fly as it transited the area listening and looking for species to identify. I tapped an approval icon for the search plan and things began in earnest.


I then put my phone away and began birding as I normally would, easily ignoring the Borb as I locked in on my own identification tasks; honestly, the device was so quiet and low-flying that I couldn't even see or hear it at that point. It was what I most expected from this location for the time of year. The ice had cleared just a week or two prior and a handful of ducks could be heard, and though from my position I couldn't see most of them, I still knew them to be Mallards. An Eastern Phoebe could be heard calling, an early arrival and my first of the year for the species. A single Red-winged Blackbird was calling to set up territory on the near shore. Soon I heard a Song Sparrow belt out full song as well, followed shortly after by a handful of American Crows flying over the pond causing a racket.


I kept listening and looking for birds and eventually ticked 14 species until the chime came from my phone indicating the Borb had completed the effort and was returning to base. Seconds later it zoomed up over the dead cattails and froze in front of me at arm's length. I held out my right hand and the Borb settled quickly and shut down, leaving all its weight in my hand. My phone chimed a second time indicating the command sequence and effort were fully complete. After shuffling back to the parking lot I set the Borb down in the car's front seat and found a bench to sit on while reviewing the results. I took a deep breath, opened the app for the Borb, selected the most recent engagement, and then chose the option to review the complete checklist of species and counts identified during the scanning operation.


I stared at the list for several moments, stunned at what I was seeing. The list indicated the device had identified 27 bird species and a total of 54 individuals during the effort. The device had nearly doubled my own efforts, and though I had expected more, I just didn't think it was really possible from a single location at this time of year. I was at 14 species and 22 total individuals, so I was keen on figuring out where all of the added species were in such a small space. Fortunately the list was expandable for each species, and when I opened Red-winged Blackbird it showed GPS coordinates for 3 individuals despite my hearing and seeing just one during my own efforts.


I further clicked on display map for the species, and their GPS locations were placed on the original grid map it had generated for the effort. The one bird I had seen was clearly marked in the exact location I had first seen it minutes ago. The other two, though, were shown off-map a few hundred feet away at another pond to the north. Though the Borb had never left the target area, it was still able to map out the birds it could hear with incredible accuracy using triangulation features built into the device's audio array. I then examined the 13 other species I hadn't identified in my own listing effort and was surprised to find Tufted Titmouse on the list. I had spent so many years in the park (almost 20) before finding even one short-term visitor that I was certain this was a device error. In fact, I recalled the problems of the early versions of the Merlin app from years prior, where small segments of other species' calls would regularly be identified as Tufted Titmouse. When I clicked into the GPS coordinates and mapping, they resolved to a woodlot even farther away to the north and west. I knew it well also, and it was reachable by trail at just about a half mile.


I girded myself and grabbed a walking stick to help steady my efforts. I was resolved to make the walk even if it took me half the day. I had to see this bird to verify what was happening and what I was being shown. I had an idea, though, and grabbed the Borb from the front seat of the car first. After holding it out in front of me I asked it to track down the most recent Tufted Titmouse identification and keep an eye on it for me while I walked out to the location. The Borb acknowledged and shot up into the sky out of sight. Roughly twenty seconds later my phone pinged with a message that the bird had been found and was being monitored.


I shook my head, at a loss for what was happening. It was so surreal that I was now following the bird GPS pins of a non-human device. I set out at the glacial pace that was standard for me these days. Breathing deep, I finally noticed the crisp cool air of a spring morning and savored the exhale. I didn't know how long I'd be able to get out into these conditions, but they filled me with such joy and appreciation for nature that I hoped it would be several more years at least. Progressing slowly, I turned a corner at the end of the wetland and wooded ring of the pond and saw the quarter-mile-distant woodlot that promised to hold a Tufted Titmouse. It wasn't a rare bird for the area, but again, it was one that just didn't spend much time in this particular park and was more likely to hold fast to the river valley several miles east. I checked the Borb app a few times and saw that the tracked bird had moved several hundred feet but was still within the same wooded space and not far off the natural trail that bisected it. The Borb app in the meantime had begun a more detailed list of the space it was now monitoring and had pushed the bird list up to 30 species since it began, the most notable being a Northern Flicker in the same area as the Tufted Titmouse. I estimated this to be an overwintered bird that was finally free to move about, since the bulk of the migrants had not made it back to the area yet.


I finally entered the woodlot twenty-odd minutes later and almost immediately heard the Tufted Titmouse calling its Peter, Peter, Peter call. I half expected a Blue Jay, for the number of times I'd been trolled in the past by them masquerading as another species. Not today though, as I finally laid eyes on the Titmouse after a few more minutes' walk. Even that was not a normal event: as I neared the prospective location I spoke into the app to have the Borb give me a visual indicator of the bird's location. The Borb moved positions and projected a foot shape on the ground in front of me that glowed and pulsed in a greenish light, then another and another, like it was walking me to a viewing position. After a few dozen of these, new projections began in red up the side of a tree like an ant march of dotted lines with an arrow in the lead. Eventually the arrow froze and just pulsed, and my phone issued an audio note.


Phone: “The Tufted Titmouse is about six feet beyond the final pulsing arrow.”


I picked it up quickly as the bird seemed to peck at a seed or nut it had found while foraging. I was dumbfounded at how the developers had created such a robust device and how it had put me on a bird with such ease nearly a half mile away from where it all began. It was in this moment that I realized everything was now different. Birding for the remainder of my days would now be something I'd never experienced before. I didn't know if the changes would all be positive and welcome, but the implications for what could be accomplished were amazing. I had used just two features of the Borb at this point and was beyond impressed with what it could do in those use cases. Just then the app pinged and asked me a question.


Phone: “Would you like Borb to take a photograph for documentation of the bird?”


Me: “Yes, please do.”


Phone: “You can also configure Borb to take photos of all species categorized as uncommon and beyond if you like.”


Me: “Yes that’ll be fine.”


Phone: “Borb configured for automatic photos of all uncommon to rare bird species.”


Just then a photo icon appeared on the display next to the entry for the Tufted Titmouse. A high-resolution photo of the Titmouse came up once I tapped the icon, showing better image quality than anything I'd taken in my long history as a bird watcher. I asked the Borb to follow me back to the car and continue scanning for new species on the way. I put in that it should hold position for any rare species, but the return trip was relatively uneventful as we walked and floated together along the trails. The only thing that piqued my curiosity was a ping about halfway back to the car when the Borb noted:


Borb: “My sensors have detected a Purple Martin flying at higher altitude above our current position, but you won't be able to see or hear the bird and it will be out of my own range in roughly 12 seconds.”


The bird had been noted because it was perhaps a week or two early for arrival. 


Me: “Would you be able to get a photograph if you pursued the bird?”


Borb: “Yes.”


Me: “Please do so as quickly as possible.”


The Borb silently jetted off and was out of sight within seconds. I looked at the phone and tapped the video icon in the upper right to show a live video feed. At first I just saw blue, but soon a black dot grew larger until I could see wings, and even a light purple hue and a slightly forked tail. The Borb locked in on the bird, adjusted focus and zoom a bit, and the image froze, showing photographic evidence of a bird I would never see personally. It would turn out the sighting was among the top five earliest for Purple Martin in the area. I wondered how reporting would be handled considering this was a real record, but not one a human would be capable of producing. I realized, though, that stationary listening stations had been listing for years, so this was scarcely much different. My only moral quandary was whether I should be claiming the sighting as well or not. After all, the device was now an extension of myself and only took the actions it did based on my instructions. Wasn't it just a new pair of binoculars, or a camera with more functionality, with me the one behind the device?


That day was just the start of the rabbit hole we all went down when the Borb entered the birding community as the first partially autonomous birding companion. When I arrived back at my vehicle I thought of the cold-storage code I had cloned from the repo and the fact that it looked an awful lot like it was meant for a fully autonomous device. What if the government had made them remove that code from the devices? Did a version of reality exist where the Borb would just be on its own and could go bird watching alone, stopping only to recharge? Would I be bird watching with a device that would act more like another person than something I owned? Would I have to ask the Borb to come birding with me?


I would get answers to these questions and more eventually. I hadn’t even scratched the surface yet.


Thursday, June 5, 2025

The Peregrine App

 Peregrine


I suppose it was only a matter of time really. Perhaps it was the inevitability of technology: to reach its promise, won't it just make everything we apply it to easier and easier, until we aren't even necessary for it to accomplish the task at hand? We ceded ground year after year for decades until one would scarcely recognize the hobby. Creature comforts and educational aids built upon one another, and advances in optics eventually combined with them to reduce the difficulties of identification to trivial matters. The hobby grew by leaps and bounds, opening it up to more and more interested parties who had less and less time available to dedicate towards personal mastery.


We never really stopped to think, though, about what the future would hold. Each year it seemed we were given some new shiny toy to play with that would enhance and add to the experience. Each iteration brought innovations that made it ever easier to be more accurate and consistent in finding, identifying, and reporting what we saw and heard.


Peregrine, it was called, the new app to replace and eliminate what had come before. An apex predator's name for an apex application. At first it seemed we would simply move from the app we had used to send in our field reports of birds to a new one that integrated the sound and visual identification features of others. Many of us had been using those tools for so long that it seemed they would be a constant for the remainder of our lives. Even longtime holdouts eventually came on board to use the Peregrine app, its features easing problem points we had long yearned to fix.


The most experienced and well-versed of the hobby loved the idea that they could apply decades of experience to a new bird and, with the simple push of a button or a voice command, take a sound sample, verify identity, and attach everything needed to the record for a watertight report. For well-geared individuals, photos could be taken in the field and automatically uploaded to the record in fractions of a second. The process was so seamless that an embedded AI module could prompt for details to be included. It was a hyper-intelligent personal field assistant that rapidly constructed uncommon and rare bird records on the spot with zero lag. Bird watcher and technology working together seamlessly. If anything could have been called the golden age of birding technology, I'd have to imagine it was that moment when each seemed equal in the pursuit.


For myself, that first day brought a swell of excitement that was at first a heady mix of unrivaled power topped with childlike amazement. It was all eventually bathed in trepidation, though, as I slowly learned the full breadth of what was sitting in my hands. Like thousands of days before it, I was up at the crack of dawn for a stationary fall migration count. The newest version of Peregrine had been on my phone for a few weeks, but it had taken me a while to get all the integrations up and running and the AI core trained to my voice and style. This would be the first day the AI would take over most of the listing duties for me. Typing had become tedious anyway with arthritis setting in over the last few years, and I was ready for the promised fully hands-free experience that had come with the update.


It was all so surreal as I pulled up to the driveway and turned left into the nature center parking lot. Every fall for roughly 40 years I had allocated time to come and count birds heading south. The new director, less than half my age, had agreed to continue allowing me access to the property prior to the typical gate time for the public. I stopped to unlock and open the gate, closed it behind me, and my phone spoke from the console.


P: “Would you like me to start a stationary list for this location?”


“Yes, please do, and thank you.”


A smile crept across my face as I realized what had just happened. I soon pulled up to the empty lot and rolled the windows down so I could listen for birds while finishing off my breakfast sandwich. In between bites I made the motion to grab my phone and key in a Gray Catbird, but realized I was in the hands-free zone now.


“Add 1 Gray Catbird please.”

 

Met with silence, I assumed I wasn't doing something right, so I grabbed the phone and unlocked the screen to look at my list, but there was the Catbird already noted. GPS coordinates were added to the record and a sound sample loaded. I checked the sample and detected nothing of the sandwich wrapper crinkling that must have initially permeated the entire recording. It had been whisked away by advanced filtering noted to be capable of eliminating 98% of ambient noise.


I then fished around the features a bit and soon found the voice prompt responses setting and saw it was unchecked. I checked it for the time being, hoping to wean myself later from expecting to talk with Peregrine the whole day like it was a birding friend of yesteryear.


As I started to gather my gear and snacks for a day of bird watching on the hilltop along the river I heard a House Wren chattering and scolding away. 


“Add 1 House Wren please.”


P: “House Wren added to your list.”


The early morning before sunrise was often pretty quiet save for these few early risers. Down along the river valley I just barely picked out a Barred Owl giving a short hoot series.


“Add 1 Barred Owl please.”


P: “Are you sure?”


“Excuse me?”


P: “My audio sensors indicate the sound just made at considerable distance has an 85% chance of being canine in origin and not a Barred Owl.”


“Yes, it was a Barred Owl.”


P: “Barred Owl added to your list.”


The moment was one I chewed on for several minutes while I stood in silence thinking about what had just happened. Was I now being tested by my device on every identification? Doubt crept in for a moment as I wondered if it was possible I had just misheard the sound, with time dulling my once-keen hearing. Not knowing the full spread of capabilities at hand, I unlocked my screen and saw the entry with a confidence percentage marked at just 15% by the AI module.


“Can you play back the Barred Owl that was just recorded please?”


P: “Playing back enhanced audio sample.”


On playback I realized the advanced microphone, coupled with built-in cleanup algorithms in the app, had made it possible to nearly perfectly isolate individual sound segments, even those at very low volume and great distance. With the background rustling gone and my own breathing cut out, it was plain as day that what was on the recording was not a Barred Owl. It was a Coyote howling in a cadence that sounded roughly similar to a Barred Owl.


I once again sat with my thoughts on the subject. I found a mix of shame and anger, though I couldn't really place why either existed. It seemed I should just remove the Barred Owl from my list and be happy for the correction on the identification. The technology had just done what it was promised to do: it made sure an identification was correct, and when I persisted it marked the record in such a way that data scientists could look at it further if needed.


“Please remove the Barred Owl from my list.”


P: “Barred Owl removed from current list, audio sample will be kept for 30 days by default.”


My first hour or so on the site went pretty smoothly beyond the owl incident, and I added birds slowly as they woke up with the sun while I awaited the first true migrant flights. It was a crisp fall morning, but comfortable, as I sipped my tea and alternated between standing and sitting on the nearby bench. I verbally noted a few Common Yellowthroats making calls from the long grasses nearby along with chip notes from a Lincoln's Sparrow. Unprompted, though, Peregrine jumped in…


P: “Historical patterns suggest you prefer 1 hour lists, which comport well with standards for stationary bird counts. Would you like me to close out this list now, submit it, and start a new one using the same protocols?”


I was flabbergasted. Peregrine had anticipated my next actions by analyzing my past history and suggesting what to do. Stuck thinking about what had happened, I didn't respond for several seconds.


P: “Sir?”


“Uhh, yes. Please do what you just suggested.”


P: “New list started. The prior list has been submitted with 12 species, 9 of which contain audio samples and none with photographs. This is your 7,085th lifetime list submitted for this location, and your site list currently contains 237 species.”


“Wait, what other statistics do you have for my birding at this location?”


P: “I have access to all user data for this location. If you would like to hear something specific, just ask.”


It became clear to me at that moment that something significant had changed while I wasn't looking or paying attention. No longer would I be sitting at home, curious about my data analytics for given locations, needing to fire up a spreadsheet to figure it out myself. I'd now just ask Peregrine to tell me the results.


“What is my single day record for species identified at this location?”


P: “Your best day occurred on September 14th, 2035. On that day you observed 118 species from 4:30AM to 8:00PM.”


This was great; I'd always been a numbers guy, and now I could just lob questions into this oracle of my birding data on any whim and retrieve the answers. I smirked a bit and settled in to watch and listen for birds, content that the tedious part of things was being lifted from my life. My focus could turn exclusively to the birds themselves and the enjoyment of their existence. I presumed my time spent not staring at the phone would now mean more birds seen, and Peregrine would even keep me honest in the process. Even with decades of experience, it seemed I would benefit from having a companion help eliminate some of the mistakes we all naturally make.


The morning wore on with the sun rising quickly over the horizon, and I found an opportunity to photograph a perched raptor a ways away from me. The now-300x optical zoom on my newest camera was capable of taking some pretty unreal photographs at distances that previously were impossible. I recognized the bird immediately as a juvenile Sharp-shinned Hawk and snapped a couple photos.


P: “Would you like me to add those photos to an entry for Sharp-shinned Hawk and note the bird's age?”

     

Before I realized what had just happened, I answered, “Yes.”


P: “Sharp-shinned Hawk, aged between 8 and 12 weeks, added to your list, along with 2 photographs.”


I hadn't even told Peregrine to add this species yet. I had just taken its picture and set the camera back on the bench. I grabbed my phone, looked at the app entry, and saw 2 photos already uploaded to the record, with a short description and the appropriate checkboxes for the bird's age. The GPS coordinates pinpointed the exact tree the bird was still perched upon as it preened in the morning light. I was certainly getting the full hands-free experience, as it seemed every minute that went by I was learning something new about how much more efficient Peregrine was going to make my bird watching. The nature center was still not open for another hour, though I had keys to get in and use the facilities. I left my gear where it sat and strolled slowly over to the admin building. As I entered the structure I heard a small chime and didn't think much of it at the time. I was back at my post watching and listening within five minutes. As the next hour ticked by I could hear employees and volunteers arriving for a school program.


P: “Shall I close out this hourly list and begin another?”


“Yes, please do.”


P: “New list started.”


Out of curiosity I grabbed my phone and looked at the prior list. It was my most hands-off of the morning thus far, and I wanted to see how Peregrine entered multiple birds of the same species when one had photos. I looked at the Sharp-shinned Hawk and could see that it now showed the species with 2 birds in a collapsed hierarchy. When I tapped the arrow to open the entry I could see the record and details for the one I photographed along with the other entry. They were separate line items, with the one verbal entry just showing my GPS location and a note that the bird was seen from this position. As I collapsed the entry back down I noticed a note on the entire list. It indicated a time span of 55 minutes with 5 minutes of non-observation time. The 5 minutes corresponded with the time I had used the restroom and was out of earshot of any birds. Was the AI so intelligent that it knew I couldn't possibly be birding during that time, so it adjusted my list?


“Peregrine, on the prior list, did you amend the time birding down by 5 minutes?”


P: “Yes, my GPS data indicates you were not birding during the time period from 7:42AM to 7:47AM.”


“How did you know that was the case though?”


P: “When the cell phone GPS signal degraded slightly it was an indicator you had entered a building. Audio samples from that time period indicate that running water was present and that you were washing your hands.”


“You recorded me in the bathroom?”


P: “Audio sampling is always occurring. My algorithm detected that you were 99% likely to be using a bathroom during this time period, so the audio samples were thrown out rather than saved.”


“Can I stop you from recording before you make such detections in the future?”


P: “Yes, you can close out the current list and not start a new one. I will cease audio recording for your lists during time periods when no active list is engaged.”


It quickly became clear that I was going to need to find a new balance with the technology in my hands and pocket. I had been aware for many years that smartwatches were able to automatically sense hand-washing and start timers to condition us toward appropriate duration and water temperature, but I hadn't anticipated that technology creeping directly into my bird watching apps as well. The long-promised total integration of technology was happening, and though it was exciting, I started to feel some pangs of concern and angst over what felt like an intrusion and a point of no return.


I watched, listened, and counted birds for 7 hours that day. All the while Peregrine dutifully recorded my observations, pulled audio samples, and dropped coordinates when appropriate for each record. I would see later that the near-constant stream of Blue Jays migrating south numbered just over 1,800 individuals, and that Peregrine had entered them as separate grouped entries each time I noted them, with timestamps and flow estimates. Despite the odd moments of realizing I was being watched and checked for accuracy, I was astounded to realize at the end of the day that I had barely keyed in anything via the keyboard the entire time. My last encounters that day were a continued demonstration of some wonderful artificial intelligence and amazing integration between user and devices.


P: “Shall I close this list out and start another one for the next hour of observations?”


“No, I think I’ll wrap up for today so you can close this list and we will be done for today.”


P: “Your list has been closed out and uploaded. You bird watched a total of 7 hours and 6 minutes today. You've listed 65 species and 2,352 individual birds during that time. Would you like me to ask Siri to start your car?”


“Yes, please.”


P: “Your car has been started by Siri and will reach preset temp of 68 degrees in 4 minutes and 35 seconds.”


I thanked Peregrine for the help that day, though I'm not sure it was necessary considering it was a machine and unlikely to care about platitudes of that sort. It felt odd, though, not to acknowledge the help, so I resigned myself to treating the program as I would have a friend who had kept checklists all day while we birded together. As I had always done in the past, I had the car drive me the long way home, hoping to add a few more species to my day list. As I approached a farm I'd long known to be a key location for Horned Larks and Eurasian Collared-Doves, I asked Siri to allow Peregrine to operate the vehicle in bird-finding mode for now. The windows came down immediately and the vehicle slowed to a crawl.


P: “Vehicle in bird mode and birding voice commands enabled. Would you like to start a mobile list for this segment?”


“Yes, please.”


We slowly edged along one of the last remaining dirt roads in the entire area. Up ahead I could see a couple Horned Larks on the road shoulder picking at the dirt. 


“Peregrine, stop the car.”


As the car pulled a bit to the shoulder it came to a stop. I stepped out and grabbed my camera for a quick shot of the Horned Larks. Further ahead I saw a dove-shaped body on the power lines leading up the farm's driveway. I pulled in another 300x zoom shot that made it look like I was standing right next to the bird.


P: “I've added 2 Horned Larks plus photos, along with 1 Eurasian Collared-Dove and a photo, to the list. Are any other individuals present?”


“Not currently.”


I recall smirking a bit to myself and thinking that Peregrine had already moved past asking if I wanted to add the photos and birds since we’d done it all day. The assumption now would be automatic adds and identifications for photos. I imagined in another few days or a week I’d turn off the notification feature I’d turned on in the morning and it would all happen in silence. 


“Continue forward at birding pace please.”


The car silently pulled back into the road and edged along the usually pretty bird laden stretch. This went on for about 10 minutes as I noted bird species along the route for Peregrine to add to the list. 


“Ok Peregrine, that is good. Please hand control back to Siri for the return trip home.”


P: “Exiting bird mode. Your list has been closed out with 14 species covering 2.6 miles in distance.” 


——


Such was life in the new AI-enhanced world of bird watching. On forums around the globe, birders posted new use cases for the technology and tips for getting more out of your AI instance of Peregrine. It appeared for several months that we had ushered in a new era for our hobby, and that science would gain benefits in the process. It wasn't until spring of the next year that a user posted a question to the developers and project leaders of Peregrine in a social media group.


Social Media Post: “Is Peregrine submitting an extra list in addition to the list I'm submitting for any given outing? I've been analyzing the packet traffic from the app, and it appears that beyond any list, sounds, and photos being sent, an additional packaged list and set of sound files are going out as well. In some cases this additional package dwarfs the ones I presume are from my current active list.”


I remember seeing this question hang in the air for a couple of days on a page that typically saw same-day responses from the developers. At first it seemed no answer was going to be offered, but then on day three came a simple “yes” and nothing more. The floodgates opened, with curious birders wondering what on earth the extra list was, and some of us knowing what it was without needing further confirmation.


It had occurred to me, back during the introduction of the earliest audio sampling apps years earlier, that a future might exist where those who carry a cell phone are no longer “needed” for their skills, only for their ability to carry a phone set to actively listen to the environment. If the technology advanced enough, it would in theory be a far superior solution to just let the application AI identify all bird-related audio and remove the human element. I assumed at that moment that we had just passed the point of no return. The machines were now better than we were at bird identification by orders of magnitude, but we had ignored that reality while enjoying how much easier it was making our lives. In hindsight it's perhaps not so bad really. We get an amazingly functional app with an AI module capable of making our lives much easier, and science gets to piggyback off our devices and efforts while recording every bird in a dragnet-style approach that is likely far superior to our own capabilities.


On the fifth day another poster started a new thread that said the following.


Social Media Post: “Read the terms of service everyone!”…“as such the user consents to allow AI routines to filter for ambient avian noise at time periods that do not conform to the user's typical use schedule of the encompassing application. Collections made during these time periods will be bundled for distribution back to the project database during the next available submission window.”


All the technology in the world, and the terms of service is still the place to go when you want to do something legally without directly telling someone you plan to do it. It's all on the up and up technically, but considering the average user sees a dozen to two dozen terms of service updates a week nowadays, it's the perfect place to hide features you want to run silently in the background. Though the language was cryptic, it was still pretty obvious. The application had been recording audio all day and night, clipping out segments with bird calls to be submitted when the next list was packaged for delivery. The immediate implications of this were pretty broad. The Peregrine AI instances had been listening for birds most hours of the day for about 6 months on several hundred thousand phones. It seemed possible that the project had just collected more bird observations in that time frame than it had gotten directly from all users in the prior decade.


A further response on the social media thread asked the golden question that ultimately became the focal point of this entire situation. 


Social Media Response: “Wait a minute. Peregrine grinding out birds when I'm not birding is one thing, but I have a question. How much better at audio ID is Peregrine than I am at this point? Like, when I'm actually birding and Peregrine is helping me build a list or correcting my identifications, doesn't that mean Peregrine's list is likely better than my list when it's time to submit? I don't recall Peregrine asking me if I heard any birds that it picked up in the background unless I'm actively using the sound ID feature itself that was migrated over from Merlin. Am I no longer necessary in this equation?”


The final question sat in the digital space like an elephant in the back of a room nobody wanted to acknowledge. One commenter simply stated, “We are obsolete.”


An official statement and response did eventually come from the program coordinator. I'm not really sure even today whether it was good, bad, or just a slap in the face from reality. We learned that day that the AI in our pocket, along with the hardware it ran on, was beyond anything we had ever imagined. It was so good, in fact, that the developers realized it was capable of detections no human could ever claim to match. Thus, they took some of the capabilities offline to prevent human list inflation from getting out of hand. They still wanted the data though, so they decided to start off by hiding that data to ease the ego bruises that were inevitable.


Official Statement From Peregrine Team: “We realize this response is overdue, but we wanted to make sure that we covered all the bases and answered as many questions as we could straight away. First we would like to offer an apology to our user base. It was not our intent to lie or mislead you in the process of getting to this point with the Peregrine app and AI core. The language found in the terms of service is of course required by our legal representation, and we fully intended to tell everyone directly in the near future what the full capabilities of the Peregrine app really were. If you followed the AI's responses closely enough, it was always implied that audio was being recorded even when you weren't using the app, but we suspect that will come as little consolation.


The reason we didn't immediately share this directly upon release of the app is that, quite honestly, we were skeptical that it was actually going to be this good. In the event that it proved to be successful, we wanted the hard data as proof of something that would likely be very difficult for enfranchised birders to stomach.


The reality is that Peregrine has, over the last 6 months, proven to be better at bird identification than 100% of application users during all time periods in which those birders were actively engaged in bird identification. This holds even accounting for the fact that a bird watcher is able to count seen-only birds that never make a noise in the wild. In fact, with any device running the newest microphone and chipset, Peregrine is currently able to hear and identify birds at distances that are physiologically impossible for humans. During testing we realized that reporting birds on screen or verbally to users who literally could never hear them would create a problem. Users would for the first time be given evidence of a bird's existence they might never be able to validate short of traveling to that bird. One might argue that would be a good thing and more birds would be found, but we ask that everyone please understand something. This wasn't just happening once or twice during a birding session; it was happening non-stop the entire session.


If you imagine standing in the middle of a field with acres of grassland all around and listening for bird species that are making noise, you would inevitably reach a maximum number of birds present that you are able to detect. Taking a sample size of 1 hour, our best field ornithologists peaked at 25 grassland species detected in the test location, but averaged closer to 17. Peregrine, during the same timeframe, detected 55 species: it not only recorded the same 17 grassland species, but added 3 more in the grassland, 20 more in a woodland a half mile away, 10 more to the north along a stream corridor, and finally 5 more at a small cattail pond roughly ¾ of a mile away from the count position. During this same time period the best ornithologist detected a total of 42 individual birds while Peregrine detected 136 individuals.


It is not simply that Peregrine proved to be above average at audio identification of birds. The AI routines and audio cleanup features in place have proven superior in every way save for visual identification of birds that are silent. Repeated testing shows that in an environment with zero ambient noise save for the sounds of nature itself, Peregrine is around 150% better by species count and in the neighborhood of 350% better at individual bird counts. These numbers do not account for an environment polluted with noise from highways, airplanes, and wind. In those spaces Peregrine excels in a way that is hard to imagine, often reaching species counts that are 800% above those of an advanced birder.


Our project and data collection faced a sobering reality: our central mandate had reached a point where the primary value of the human involved in data collection was visual identification and serving as a vector for carrying a recording device. In effect, we have now trained a machine to be better than every human on earth at bird identification, and we didn't know what to do with that information beyond proving it in live testing. These last 6 months have created a firehose of information, and if we used only the extra Peregrine AI-submitted lists for that time period, throwing out the user-submitted lists, we would still have the equivalent of 10 years' worth of collected data on birds.


All of the above combined means we are at a unique crossroads for bird data collection on a worldwide scale. Our legal team has informed us just this week that the sound identification AI core has been accepted by Google for inclusion in their standard mobile device deployment stack. This means that, effective next month, all mobile devices on the planet will receive an update to begin recording and automatically submitting bird population data. This may be a bit of a shock, but our project and all downstream dependent projects will have more data in a single day than has ever been submitted by users in the last 30+ years combined.


It may seem like the birding community is being abandoned and has been used, but we look at this as an achievement for everyone. Data collection will go mainstream, and a normal everyday person equipped with a cell phone, walking in a local park for exercise, will now produce a bird list that previously depended on someone with decades of experience. Data consistency in these AI-driven lists is so strong that most of our filtering technology for throwing out inconsistent or inaccurate data can now be shelved. We will no longer need reviewers to engage users on flagged reports because the AI core will have handled such concerns.


For the time being the Peregrine app will continue to be downloadable and available for use. Our team will provide more information in the near future on the direction the app and our data collection efforts will go moving forward.”


——


It was what it was, as the kids used to say. We had just learned that the AI student had been trained on the decades of data we had collected and submitted, and that it was now the master. The tone of the statement indicated that we should expect some final technological stroke where we would retain access to the technology and our personal bird watching would still be the better for it, but that we should get comfortable with the fact that we would never have access to everything and that our own data was of relatively little value to science. For all intents and purposes we were not good enough, or even smart enough, to handle what was being produced. Even in the field and in the moment, it would be a firehose of bird identifications in our faces, the likes of which we couldn't handle or comprehend. For lack of a better phrasing, we would have to evolve as humans before we could take advantage of newer and better technology.