Ibrahim Ulukaya
Developer Programs Engineer

We at the Firebase office all enjoyed playing with Hanley Weng's "CoreML-in-ARKit" project. It displays 3D labels on top of images it detects in the scene. While the on-device detection provides a fast response, we wanted to build a solution that gave you the speed of the on-device model with the accuracy you can get from a cloud-based solution. Well, that's exactly what we built with our MLKit-ARKit project. Read on to find out more about how we did it!

This image takes a while to load, but it’s worth it.

How it all works

ML Kit for Firebase is a mobile SDK that enables developers to bring Google's machine learning (ML) expertise to their Android and iOS apps. It includes easy-to-use on-device and cloud-based Base APIs and also offers the ability to bring your own custom TFLite models.

ARKit is Apple's framework that combines device motion tracking, camera scene capture, advanced scene processing, and display conveniences to simplify the task of building an AR experience. You can use these technologies to create many kinds of AR experiences using either the back camera or front camera of an iOS device.

In this project we push ARKit frames from the back camera into a queue. ML Kit processes them to identify the objects in each frame.

When the user taps the screen, ML Kit returns the detected label with the highest confidence. We then create a 3D text bubble and add it to the user's scene.
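Here's a rough sketch of that on-device loop, assuming the FirebaseMLVision API of that era (Vision.vision().onDeviceImageLabeler()) and an ARSCNView; names like latestPrediction and labelCurrentFrame are our own illustration, not code lifted from the sample project:

    import ARKit
    import FirebaseMLVision
    import UIKit

    class ViewController: UIViewController {
        @IBOutlet var sceneView: ARSCNView!

        // On-device labeler: fast and free, so we can run it on every queued frame.
        private lazy var onDeviceLabeler = Vision.vision().onDeviceImageLabeler()
        private let visionQueue = DispatchQueue(label: "com.example.visionQueue") // illustrative label
        private var latestPrediction = "…" // most recent highest-confidence label

        func labelCurrentFrame() {
            // ARKit exposes the current camera image as a CVPixelBuffer.
            guard let pixelBuffer = sceneView.session.currentFrame?.capturedImage else { return }
            let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
            guard let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent) else { return }
            let visionImage = VisionImage(image: UIImage(cgImage: cgImage))

            visionQueue.async {
                self.onDeviceLabeler.process(visionImage) { labels, error in
                    guard error == nil,
                          let best = labels?.max(by: {
                              ($0.confidence?.doubleValue ?? 0) < ($1.confidence?.doubleValue ?? 0)
                          }) else { return }
                    // Remember the top label so a screen tap can use it immediately.
                    self.latestPrediction = best.text
                }
            }
        }
    }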

How ML Kit works

ML Kit makes ML easy for all mobile developers, whether you have experience in ML or are new to the space. For those with more advanced use cases, ML Kit allows you to bring your own TFLite models, but for more common use cases, you can implement one of the easy-to-use Base APIs. These APIs cover use cases such as text recognition, image labeling, face detection and more. We'll be using image labeling in our example.

Base APIs are available in two flavors: on-device and cloud-based. The on-device APIs are free to use and run locally, while the cloud-based ones provide higher accuracy and more precise responses. The cloud-based Vision APIs are free for the first 1,000 uses per API per month and paid after that. They provide the power of full-sized models from Google's Cloud Vision APIs.

Hybrid approach

We are using the ML Kit on-device image labeling API to get a live feed of results while keeping our frame rate steady at 60fps. When the user taps the screen, we fire off an async call to the cloud image labeling API with the current image. When we get a response from this higher accuracy model, we update the 3D label on the fly. So while we continuously run the on-device API and use its result as the initial source of information, the higher accuracy Cloud API is called on demand and its result eventually replaces the on-device label.

Which result to show?

While the on-device API is real-time with all the processing happening locally, the Cloud Vision API makes a network request to the Google Cloud backend, leveraging a larger, higher accuracy model. Once the response arrives, we replace the label provided by the on-device API with the result from Cloud Vision API.
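Continuing the sketch above (inside the same view controller), a tap drops a text bubble showing the current on-device label, then fires the cloud labeler and swaps in the better label when it arrives. This again assumes the FirebaseMLVision API of that era; gesture wiring and node placement are elided:

    // Cloud labeler: higher accuracy, called on demand when the user taps.
    private lazy var cloudLabeler = Vision.vision().cloudImageLabeler()

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        // 1. Show the on-device result right away.
        let bubbleNode = makeBubbleNode(text: latestPrediction)
        // (In a real app, position the node with a hit test against the tapped point.)
        sceneView.scene.rootNode.addChildNode(bubbleNode)

        // 2. Ask the cloud model for a better answer and update the bubble in place.
        guard let pixelBuffer = sceneView.session.currentFrame?.capturedImage else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent) else { return }
        cloudLabeler.process(VisionImage(image: UIImage(cgImage: cgImage))) { labels, error in
            guard error == nil,
                  let best = labels?.max(by: {
                      ($0.confidence?.doubleValue ?? 0) < ($1.confidence?.doubleValue ?? 0)
                  }),
                  let textGeometry = bubbleNode.geometry as? SCNText else { return }
            DispatchQueue.main.async {
                textGeometry.string = best.text // replace the on-device label with the cloud one
            }
        }
    }

    private func makeBubbleNode(text: String) -> SCNNode {
        let bubble = SCNText(string: text, extrusionDepth: 0.01)
        bubble.font = UIFont.systemFont(ofSize: 0.15)
        let node = SCNNode(geometry: bubble)
        node.scale = SCNVector3(0.2, 0.2, 0.2)
        return node
    }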

Try it yourself!

1. Clone the project

$ git clone https://github.com/FirebaseExtended/MLKit-ARKit.git

2. Install the pods and open the .xcworkspace file to see the project in Xcode.

  1. $ cd MLKit-ARKit
  2. $ pod install --repo-update
  3. $ open MLKit-ARKit.xcworkspace

3. To set up Firebase ML Kit in the sample app:

  1. Follow these instructions for adding Firebase to your app.
  2. Make sure to specify "com.google.firebaseextended.MLKit-ARKit" as the iOS project bundle ID.
  3. Download the GoogleService-Info.plist file generated as part of adding Firebase to your app.
  4. In Xcode, add the GoogleService-Info.plist file to your app, next to Info.plist.

At this point, the app should work using the on-device recognition.
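One note if you adapt this to your own app rather than the sample: the GoogleService-Info.plist is only picked up once Firebase is initialized at launch, typically in the app delegate:

    import Firebase
    import UIKit

    @UIApplicationMain
    class AppDelegate: UIResponder, UIApplicationDelegate {
        var window: UIWindow?

        func application(_ application: UIApplication,
                         didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
            // Reads GoogleService-Info.plist and sets up the default FirebaseApp,
            // which ML Kit (Vision.vision()) uses under the hood.
            FirebaseApp.configure()
            return true
        }
    }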

4. (Optional) To set up Cloud Vision API in the sample app:

  1. Switch your Firebase project to the Blaze plan
    Only Blaze-level projects can use the Cloud Vision APIs. Follow these steps to switch your project to the Blaze plan and enable pay-as-you-go billing.
    1. Open your project in the Firebase console.
    2. Click on the MODIFY link in the lower left corner next to the currently selected Spark plan.
    3. Select the Blaze plan and follow the instructions in the Firebase Console to add a billing account.

      ★ The cloud label detection feature is still free for the first 1,000 uses per month. Click here to see additional pricing details.

  2. Go to the ML Kit section of the Firebase console and enable the "Cloud Based APIs" toggle at the top.

At this point, the app should update labels with more precise results from the Cloud Vision API.

    David East
    Developer Advocate

    Firebase launched over six and a half years ago as a database, but since then we've grown into a platform of eighteen (18!!) products. And over the last year we've announced a number of new features to help you build better apps and grow your business. We also infused Firebase with more machine learning super-power, so you can make your apps smarter, and matured the platform, so Firebase works better for developers at large, sophisticated enterprises.

    Since the end of the year is a great time for top-ten lists, we were going to cap off the year with our own "Top Ten List of Firebase launches." But, then, we realized we had more than ten launches we wanted to talk about, and we really don't like playing favorites. So instead, here's our "Thirteen Firebase Launches In No Particular Order Because They're All Great In Their Own Way" list for 2018. Enjoy!

    13. ML Kit democratizes machine learning

    At Google I/O, we launched one of our most exciting features of 2018: ML Kit for Firebase, a machine learning SDK for Android and iOS. ML Kit lets you add the power of machine learning to your app, without needing an advanced degree in neural networks. It provides a number of out-of-the-box solutions for performing tasks like recognizing text in images, labeling objects in photos, or detecting faces. And it will also let you use custom models, for those of you who are into building your own. (Bespoke artisanal neural networks are big among hipster data scientists these days.)

    12. In-App Messaging helps with customer engagement

    Notifications are a great way to get latent users back into your app, but how do you communicate with users who are actively using your app? In 2018, we launched Firebase In-App Messaging to help you send targeted and contextual messages to those active users. In-app messages are a great way to encourage app exploration and discovery, guiding users towards new features in your product or towards that important conversion event.

    11. New REST APIs make task automation easier

    At Firebase, we're big fans of building scripts to make our lives easier; whether that's to automate common tasks, or to perform custom logic. To help with that goal, we launched three new REST APIs that you can use to automate your life (at least from a Firebase perspective). The Firebase Management API is great for automating tasks like creating new projects, the Remote Config REST API can be useful for customizing the way you update Remote Config values, and the Firebase Hosting API can be used to automatically upload certain files to your site.

    Recently, StackBlitz and Glitch used the Management API to build integrations that allow you to deploy projects directly to Firebase Hosting. Start a project, write some code, click a few buttons, and voila! You've deployed your Firebase project to the web!

    10. Performance Monitoring graduates to general availability

    Good performance is one of the key factors for creating a great user experience. Firebase Performance Monitoring automatically collects performance metrics where it matters the most: the real world.

    This year, Performance Monitoring graduated from beta into general availability. Along the way, we added helpful new features like an issue feed in the dashboard to highlight important performance problems your users are encountering. We've also added session view support for network class and traces, which lets you dig deeper into an individual session of a trace, so you see attributes and events that happened leading up to a performance issue.

    9. Predictions also graduates to general availability!

    We also released Firebase Predictions into GA. Predictions uses machine learning to intelligently segment users based on their predicted future behavior. Along the way, we added health indicators and evaluation criteria to every prediction, so you can better understand how reliable a prediction is, as well as the data being used to make it. We also integrated Predictions with BigQuery, so you have more control over your data.

    Getting started with Predictions is as easy as flipping a switch in the console. We predict you're going to love it! (Sorry.)

    8. Cloud Functions graduates to general availability! Everybody goes GA!

    The general availability party keeps on going! Cloud Functions hit GA and we also released a new version of the SDK. The new SDK adds "callable" functions that make it much easier to call server functions from the client, especially if your function requires authentication.
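    As a rough illustration of why callables are handy, here's what invoking one might look like from an iOS client. The function name addMessage is hypothetical; the client SDK forwards the signed-in user's auth context to the function for you:

    import FirebaseFunctions

    let functions = Functions.functions()

    // "addMessage" is a hypothetical callable function deployed with the Functions SDK.
    functions.httpsCallable("addMessage").call(["text": "Hello from iOS"]) { result, error in
        if let error = error {
            print("Callable failed: \(error.localizedDescription)")
            return
        }
        // Whatever the function returned is available as result?.data.
        if let data = result?.data as? [String: Any] {
            print("Server returned: \(data)")
        }
    }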

    Cloud Functions also released a brand new library, firebase-functions-test, to simplify unit testing functions. This library takes care of the necessary setup and teardown, allowing easy mocking of test data. So in addition to simple standalone tests, you can now write tests that interact with a development Firebase project and observe the success of actions like database writes.

    7. Test Lab gets testier! (But in a good way)

    Firebase Test Lab went cross-platform in 2018 by adding support for iOS. Now you can write and run tests on real iOS devices running in our data centers. Test Lab supports ten models of iPhones and iPads running seven different versions of iOS, including iOS 12.

    Test Lab also launched a number of improvements to Robo, a tool which runs fully automated tests on Android devices. Testing games is now easier, thanks to 'monkey actions' (which can randomly click on your screen) and game loops (which perform pre-scripted actions). Robo is also more customizable now, in case you need to sign in at the start of your app or add intelligent text to a search field.

    6. New emulators for Firestore and Realtime Database make testing easier

    Continuing the theme of testing, in 2018, we launched emulators for Firestore and the Realtime Database, so you can more easily unit test your security rules and incorporate them into a continuous integration environment. These emulators run locally and allow you to test your security rules offline so you can be confident before deploying to production. We also created a testing library that simplifies your test code.

    5. Stackdriver integrations enable better logging and monitoring

    From the beginning, Cloud Functions has tightly integrated important usage metrics with Stackdriver, Google Cloud's powerful monitoring service. To deepen our integration further, we linked the Realtime Database with Stackdriver. You can now see even more metrics than the Firebase console provides, such as load broken down by operation type and information about your downloaded bytes.

    The real power of this integration is to set up alerts on metrics or errors so you can detect and respond to issues before your customers notice them.

    4. BigQuery integrations give you more control of your data

    Sometimes the reporting dashboards in the Firebase console don't give you the level of granularity or specific data slice that you need. That's where BigQuery - Google Cloud's data warehouse - and Data Studio - Google Cloud's data visualization tool - come into play.

    We've given you the ability to export your Analytics data to BigQuery for a while now. This year, we added integrations with Predictions and Crashlytics, so you can export even more of your Firebase data into one central warehouse. Learn more about using Firebase and BigQuery together here.

    3. Cloud Firestore works better for sophisticated enterprises

    Cloud Firestore is our next generation database with many of the features you've come to love from the Realtime Database, combined with the scale and sophistication of the Google Cloud Platform. Over the course of 2018, we've launched a number of improvements to Firestore, to make it better suited for complex enterprises.

    We also added some nice features along the way -- we expanded offline support for the web SDK from one browser tab to multiple. We've added better support for searching documents by the contents of their arrays. And we added multiple new locations where you can store your Firestore data: Frankfurt, Germany and South Carolina, USA. (We'll be adding even more locations in 2019.)

    2. The Firebase console becomes even easier to use

    The Firebase console is a crucial part of the Firebase workflow for just about any team. We spent a lot of time in 2018 making the console better than ever. Here are a few things we added:

    • Security Rules simulator for Cloud Firestore
    • Redesigned notifications dashboard
    • Version history for Security Rules
    • Remote Config change history
    • Filtering and sorting of Firestore documents
    • Code-completion in the Security Rules editor
    • Performance insights to surface issues across your app

    These features make you more productive and confident in your app's security and performance. We can't wait to add more to the console in 2019!

    1. Firebase support arrives in Google Cloud

    For a while now, we've been hearing from some of you that you'd like an option to get enterprise-grade support for Firebase. To address that request, we added support for Firebase to our Google Cloud Platform (GCP) support packages, available in beta right now.

    If you already have a paid GCP support package, our beta will let you get your Firebase questions answered through the GCP support channel - at no additional charge. When this new support graduates to general availability, it will include target response times, technical account management (for enterprise tier), and more. You can learn more about GCP support here.

    If you're planning to stick with Firebase's free support, don't worry - we don't plan to change anything about our existing support model. Please continue to reach out to our friendly support team for help as needed!

    Happy New Year!

    It's been a great year, so we're going to take a little time with friends and family before we hit the ground running in January. However you celebrate the end of your year, we hope your December is full of happiness and relaxation. And if it happens to be full of building mobile or web apps, we hope you use Firebase! Happy building!

    Todd Kerpelman
    Developer Advocate

    Hi, there, Firebase developers! We wanted to let you know about some important changes coming your way from Google Analytics for Firebase that will affect how we help you measure user engagement and sessions. This might also affect any BigQuery queries you might have written, so let's get right into the changes, shall we?

    What's changing with sessions?

    Up until now, sessions were measured using the following formula:

    • Google Analytics for Firebase would trigger a session_start event if there was no current session, and the app was in the foreground for more than 10 seconds
    • A session would be considered completed when more than 30 minutes had passed since the app was in the foreground
      • This meant that if a user used your app for a little while, briefly switched to another app to respond to a chat message, then switched back to your app, that would still count as one session.
      • Both of these time values could be configured locally on the client
      • If you wanted to group events by session in BigQuery, you'd essentially need to do that manually. That is, you'd need to select all events for the same user_pseudo_id that occurred 10 seconds before a session_start event, and keep going until you hit a 30 minute gap. As you might expect, grouping events by session was a not-very-fun experience for BigQuery developers.

    With the latest version of the Firebase SDK, we're going to be changing how a session is measured. Specifically:

    • Google Analytics for Firebase will now trigger a session_start event as soon as your app goes into the foreground. There's no more 10-second delay.
    • Like before, a session is considered finished when more than 30 minutes have passed since your app was in the foreground…
      • ...except that you can now add an extend_session parameter to any event, which tells Analytics that, even if this event is triggered in the background, it is considered part of an active session. This is useful if you have an app that people frequently use in the background, like a music or navigation app (see the sketch after this list).
      • We will now add new properties to nearly every event that let you know what session they were in. Specifically, you'll now have a ga_session_id parameter which is a unique identifier for the session, and a monotonically increasing ga_session_number parameter to help you count the number of sessions for this user.
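    For example, here's roughly what logging an event with that extend_session parameter could look like from an iOS app; everything in this snippet other than the extend_session key is made up for illustration:

    import FirebaseAnalytics

    // Hypothetical event from a music app that keeps playing while backgrounded.
    // The extend_session parameter tells Analytics to count this event as part
    // of the active session even though the app isn't in the foreground.
    Analytics.logEvent("background_audio_progress", parameters: [
        "extend_session": 1,
        "track_id": "abc123" // illustrative custom parameter
    ])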

    So, what do all of these changes mean?

    In the Firebase console, the biggest change you'll notice is that your app will have more sessions, because we'll be counting instances where users interact with your app for less than ten seconds. This also means that any kind of "average some_event per session" stat will decrease, since the number of sessions is going up.

    On the BigQuery side of things, these new event parameters will make your life a whole lot easier. Analyzing anything by session should be really straightforward now -- you just need to group them by ga_session_id. So calculating your own "average xxx per session" values will be a lot easier in BigQuery.

    For example, here's a query where we calculate how many level_complete_quickplay events an average user generates per session:

    SELECT AVG(total_quickplays) as average_quickplays_per_session FROM (
      SELECT COUNT(event_name) as total_quickplays,
        (SELECT value.int_value FROM UNNEST (event_params) WHERE key = 
          "ga_session_id") as session_id
        FROM `firebase-public-project.analytics_153293282.events_xxxxxxxx` 
      WHERE event_name = "level_complete_quickplay"
      GROUP BY session_id
      HAVING session_id IS NOT NULL
    )
    

    And if you want to figure out, say, how many sessions it typically takes before somebody makes a purchase, you can do that by analyzing the ga_session_number parameter.

    What's changing with user engagement?

    In the past, Firebase measured total user engagement by recording the amount of time your user spent with the app in the foreground and then sending down those values (as incremental measurements) as user_engagement events. You could then calculate the total amount of time a user spent within your app by adding up the values of the engagement_time_msec parameter that were sent with each of these events.

    These user_engagement events were typically sent when a user a) sent your app into the background, b) switched screens, c) crashed, or d) used your app for an hour. As a result, it was very common to see user_engagement events sent alongside events like app_exception or screen_view, so much so that we asked ourselves, "Why are we sending down all these extra events? Why not just send engagement time as a parameter with these other events we're already generating?"

    And so that's exactly what we're going to do, starting in early 2019. You will still occasionally see separate user_engagement events, but you will also start seeing engagement_time_msec parameters added to other events automatically generated by Google Analytics for Firebase. We're going to start with screen_view, first_open and app_exception events, but you might see them added to other events in the future.

    What do these changes mean to you?

    On the Firebase console, nothing should change. Your app might end up using a little less data, since you're no longer sending down so many separate user_engagement events, but otherwise, nothing else should look different.

    On the BigQuery side of things, you'll need to alter your queries slightly if you were calculating important metrics by filtering for user_engagement events. Instead of filtering on the event name, you'll want to look for any event that contains an engagement_time_msec parameter.

    For example, here's a query that calculates the total user_engagement time for each user by summing up the engagement_time_msec parameter for user_engagement events. This might work today, but it will be inaccurate in the future.

    SELECT SUM(engagement_time) AS total_user_engagement 
    FROM (
      SELECT user_pseudo_id, 
        (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 
          "engagement_time_msec") AS engagement_time
      FROM `firebase-public-project.analytics_153293282.events_20181003` 
      WHERE event_name = "user_engagement"
    ) 
    GROUP BY user_pseudo_id
    

    So here's that same query, modified to look for all events that might have an engagement_time_msec parameter:

    SELECT SUM(engagement_time) AS total_user_engagement 
    FROM (
      SELECT user_pseudo_id, 
        (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 
          "engagement_time_msec") AS engagement_time
      FROM `firebase-public-project.analytics_153293282.events_20181003` 
    )
    WHERE engagement_time > 0
    GROUP BY user_pseudo_id
    

    The nice thing about that second query is that it works both with the old way of measuring user engagement and the new one, so you can modify your BigQuery queries today, and everything will still work just fine when the new changes go into effect.

    Update: Well, it took a little longer than planned, but this feature launched in April of 2020. If you've been using this second BigQuery query all along, then congratulations! Everything should continue working as before. If not, well, there's no better time to switch over.

    We hope that these changes make your life a little easier in the long run, and offer only a minimal amount of disruption in the short term. In the meantime, if you have any questions, feel free to reach out on StackOverflow, or any of our official support forums.

    Happy analyzing!

    Shobhit Chugh
    Product Manager
    Todd Burner
    Developer Advocate

    As we build Crashlytics and talk to our developers, we've found that the way they use our dashboards is often nuanced and specific to their team. We've done our best to incorporate the themes we hear most often into the dashboard you see in the Firebase console, but one dashboard solution simply isn't enough.

    That's why we launched the Crashlytics integration with BigQuery, giving you the freedom to deeply explore your data. And, using Data Studio (a free tool that sits on top of BigQuery), you can make custom dashboards from your Crashlytics data that fit the unique way your team works. Data Studio allows your team members who aren't comfortable with SQL to easily work with the BigQuery data set. Data Studio dashboards are also easy to collaborate on and share, so your team can work more efficiently.

    Today, we're launching a Data Studio template that gives you a preview of what's possible with Crashlytics and BigQuery. Let's take a closer look at the template.

    Summary of what matters to you

    The overview section of our template shows which OS versions crash the most, which devices crash the most, and how crashes are trending over time. You can customize each section to display the results of the exact queries you want, presented the way you need, based on your business logic. If you want to keep an eye on the deprecation of an old operating system, you can change or filter the queries that back the dashboard directly.

    Understand trends using custom keys

    Up until now, exploring your crash reports by custom metadata like Experiment ID or an Analytics breadcrumb has been limited, making it tough to identify which variant in an experiment is least stable or which level in a game has the most crashes. Now, when you export your data to BigQuery, it's easy to run any deep analysis you want, and then visualize your report with Data Studio or any other business system you use.

    As an example, say that you set up your Android game to log the level the player is on as a custom key, so it's attached to any crash that occurs:

    Crashlytics.setInt("current_level", 3);
    

    Now you can filter by the presence of a key and its values. We've created a sample dashboard for filtering these in our Data Studio template.

    Know which areas of code are most impacted

    We know our largest apps have different teams that specialize in specific areas of the code. For that, we've made it easy to filter by specific files in our Data Studio template.

    Owning your experience

    Our Data Studio template is totally customizable: if you'd rather filter on a different part of our schema or build more advanced tools, you can easily adapt it to your needs. You can adjust the template using the Data Studio UI or you can edit the backing BigQuery queries.

    Your team can all work together by sharing the dashboard in Data Studio. This means team members don't need to learn SQL to get the benefits of the Crashlytics integration with BigQuery.

    The Crashlytics dashboard currently retains data for 90 days, but with BigQuery you own the retention and deletion policies, so you'll be able to select date ranges longer than 90 days and track year-over-year trends in stability data. This means you'll be able to customize your dashboard to display data over the exact period you are interested in.

    Get started today

    With just a few clicks from the Firebase Crashlytics dashboard you can enable daily exports of all raw crash data on a per-app or per-project basis. This includes your stack traces, logs, keys and any other crash data. You can also use the new BigQuery sandbox to get started for free.

    Once you link Crashlytics to BigQuery, follow these instructions to connect this template with your Crashlytics dataset.

    If you are a current Fabric user, you can gain access to BigQuery export and all the other features of Firebase by linking your app in the Fabric dashboard. Check out this link for details and documentation.

    We hope this improvement makes it even easier for you to dig into your crash reporting data and efficiently debug your app! As always, if you have any questions, you can find us on Twitter (@firebase) and on Stack Overflow. Happy debugging!

    Emin Israfil
    Co-founder, Rubbish
    Elena Guberman
    Co-founder, Rubbish

    No one likes litter, so why do we live with it? Litter directly impacts a community's health, safety, and economic potential. At Rubbish, we believe people should love where they live. That's why we created the Rubbish app, which empowers neighborhoods to tackle litter at the local level by photographing and reporting litter, sharing and analyzing the data, and engaging community partners to clean up together. Our mission is to build stronger, healthier communities with less trash, more beautiful streets, and happier residents, and we can't do it without Firebase.

    Here's a quick video of how Rubbish works

    From Concept to Launch with Firebase

    The concept for Rubbish resulted from a moment of panic and frustration: while we (Elena and Emin, co-founders of Rubbish) were walking the streets of New York City with Elena's dog Larsen, he choked on a chicken bone. Luckily, he was ok, but the two of us were not. Why was litter an unfortunate part of city living, with no effective solution to address it?

    This is Larsen. He's a good boy.

    We decided to tackle this issue and find an innovative solution together. We started to document litter daily, taking pictures and noting problem areas in our communities, which quickly accumulated into thousands of photos sitting in a stagnant shared album. We needed a better way to store and organize the information we were collecting so we could use it to make a difference. We also needed a way to share the photos and their metadata with several audiences (governments, community partners) and on several social media channels through our app. Each platform had its own set of requirements and specifications, and the idea of creating the infrastructure to accomplish this was daunting, until we discovered Firebase.

    Challenge #1 - Gather, Process and Share Data in a Seamless Way

    To combat the litter problem and make real change, we needed a quick, seamless way to gather, process, and share all the information surrounding each documented piece of trash.

    We evaluated lots of options, but Firebase stood out because it provided a comprehensive set of tools that allowed us to quickly build the backend infrastructure of the Rubbish app and address the challenges of storage, data validation, processing, and distribution.

    For example, we faced the challenge of quickly storing and tracking user-generated photos. Cloud Storage and Firestore allow us to keep track of what is being reported and where. Another challenge was verifying user submissions, especially ones requiring priority attention from third parties, like reports that need local agency involvement. With the help of Cloud Functions for Firebase, we set up a dashboard to summarize the data and generate reports in one place. We also instrumented Cloud Functions to act as a safety net and help us with quality control. For instance, before reports are automatically formatted and sent to local government agencies like San Francisco 311 for follow-up, the functions check that the submissions came from validated users with good track records, and are in the correct vicinity of the agency. We use Cloud Functions to trigger a validation review via our backend and via email whenever a photo is uploaded. Then, a member of our team evaluates the uploaded image to make sure it's clear and relevant. This makes an otherwise complicated process easy and automated.

    Additionally, we use Firebase Authentication and Security Rules to ensure that only the intended information gets shared, and to protect each user's privacy and security. Firebase allows us to seamlessly integrate our data with APIs from local governments, social networks, and our own app in a few lines of code. With Firebase, Rubbish can effectively store, share, and process the data to create real insights and impact. In addition to Firebase, we also use some of Google Cloud Platform's APIs, such as the Google Sheets API, Maps SDK for iOS, Places API, Geocoding API, and Cloud Runtime Configuration API.

    Firebase-powered dashboard that allows us to manage user submissions.

    This is one of our dashboards for tracking neighborhood trends.

    Challenge #2 - Reducing Onboarding Time for New Teammates

    As we grew our software development team, we were concerned about the time and resources it takes a new team member to get up to speed and become productive. Firebase provided easy onboarding of new members with user-friendly training resources, like robust sample projects, fun developer videos, straightforward technical documentation, and more. In fact, our new engineers are onboarded and ready to contribute three times faster, saving us significant time and resources that can now be focused elsewhere. We reduced development time on new features, as well as the time needed for maintenance, security handling, and developer onboarding, which maximizes our productivity.

    In short, Firebase enables start-up teams like us to communicate effectively, share information, and grow. It's a huge value for us that Firebase allowed us to effectively engage such a variety of talented, passionate individuals.


    Our team and their favorite Firebase product or their favorite snack.

    Growing with Firebase

    Since Firebase covers the backend infrastructure behind the app and facilitates collaboration on our team, we can focus on expanding our field testing and cultivating relationships with important partners. We launched a pilot program on San Francisco's Polk Street in August 2018, working with the community to sponsor resident-led street cleanings. We use the data we collect to inform local sponsors and residents about the progress, including summaries of the number and types of trash collected - none of which would be possible without Firebase.

    We've also been collaborating with the San Francisco Community Benefit Districts and the local San Francisco government to optimize and track improvements through Rubbish. For example, we pinpointed the largest source of cigarette butts (customers at bars and restaurants) and worked with these businesses to install cigarette receptacles. We're excited to find even deeper trends and new ways to analyze and address the litter problem.

    Solving Unsolvable Problems

    As Rubbish continues to map and track litter, we are finding that trash patterns on the street can be as dynamic as traffic patterns. Local events, the weather, and time of day all play a role in determining what your street will look like when you step out for your morning walk. The data we collect is providing insight into important trends like these and is being used to help local communities sponsor and track clean-up efforts in a meaningful way. By relying on Firebase to store, process, and analyze an increasing amount of data, we feel confident that we can engage and empower individuals, communities, and governments to tackle extensive, seemingly unsolvable problems like litter.