All posts by Brendan Buff

Fundamentals of Data Science and Visualization

Virtual Training

November 9 – 19, 2020

Classes: Nov 9, 10, 16, 17, 19 from 2:00-4:00 pm Eastern

Office Hours: Nov 9, 10, 16, 17, 19 from 4:00-5:00 pm Eastern

Download Agenda | Online Registration

Data analysts can use a variety of methods and tools to accomplish their goals. With a deeper understanding of data visualization software packages, your organization can produce more intuitive data visualizations in less time and identify the best software solutions to optimize your team’s workflows.

In this course, we will review best practices in data visualization design and use cases for Excel, Tableau, and R (programming language).

Learn how to clean and format data in Excel, create interactive dashboards in Tableau, and clean and visualize data in R. This course will help participants identify use cases for each software package that maximize impact with minimal effort, expanding each participant's analytical toolbox.

Join us to learn about how your organization can better leverage data visualization software!

Meet Your Instructor:

Lee Winkler joined the Center for Regional Economic Competitiveness (CREC) in 2018 after graduating with a Master’s in Public Policy from the George Washington University. He currently supports projects analyzing state-level certification and license attainment and the prevalence of educational and workforce credentials. Lee regularly uses Tableau to clean data, mine insights and create interactive visualizations and is excited to help the class find how Tableau can add value to their workflow.

Registration:
APDU Members: $390
Non-Members: $715

Looking Back on the 2020 APDU Annual Conference


With the 2020 APDU Annual Conference in the rearview mirror, now is a good time to reflect on the week and look ahead to what’s next.

This year’s conference, as so many things in 2020, was disrupted but not diminished. While we didn’t have the opportunity to meet with each other in person, the virtual format enabled some of our friends from around the country to participate who might not have been able to otherwise.

Speakers like danah boyd of Microsoft Research and the Data & Society Research Institute (excerpted above) brought a unique perspective to the conference, challenging our thinking on issues ranging from how we approach privacy and accuracy to the impacts misinformation and data voids can have on our understanding of data quality and reliability.

Federal agency leaders such as Deborah Stempowski, Brian Moyer, Bill Beach, and Mary Bohman provided insider insights into their organizations.

Speakers from universities and research organizations across the country covered hot topics such as data on COVID-19, evictions, policing, and more.

Speakers from the Census Bureau, universities, and nonprofits discussed how the Disclosure Avoidance System will affect the quality of Census data.

Attendees met with APDU board members in a series of town hall conversations on a variety of topics – offering a promising way for APDU members to connect with one another.

This year’s conference was a success for a variety of reasons – but the biggest reason was the engagement of our attendees and speakers. Stay tuned for continued quality programming in Fall 2020!

Intermediate Data Visualization Techniques in Tableau

August 25-September 3, 2020

Virtual Training

AGENDA

A picture is worth a thousand words. Use data to state your case with easy-to-understand data visualization tools. Give your audience the freedom to explore your data in new ways with interactive dashboards that answer immediate questions and uncover new insights. Data visualization tools can help you communicate better both internally and with your partners.

Tableau can help you produce more intuitive data visualizations, and we can show you how. In this course, you will build your skills in making appropriate graphics, but you will also incorporate complex calculations in ways that improve insights, make charts more relevant, and create the most impactful dashboard graphics.

Learn how to clean, shape, aggregate, and merge frequently used public data in Tableau Prep. Then, organize your visualizations into sleek dashboards in Tableau Desktop. We will provide helpful tips on how to analyze, design, and communicate these data in ways that will wow your supervisor and organization’s customers.

Training Prerequisites:

Skills: Participants must have a basic understanding of how Tableau works before attending this class, including knowledge of Tableau terminology, uploading data, editing data sources, and creating basic charts. Attendees should be familiar with all materials presented in the Pre-Session Videos: Overview of Charts and Calculated Fields.
Tools: Laptop, wired mouse, Tableau Desktop (personal, professional, or public version), and Tableau Prep.
• The public version of Tableau Desktop is available at:
https://public.tableau.com/s/download
• Tableau Prep Software can be downloaded here:
https://www.tableau.com/products/prep/download

Note: Zoom will be required for this training. If you have Zoom restrictions on a work laptop, we recommend using a personal laptop or desktop. We do not recommend using an iPad for this training.
Pricing
APDU, C2ER, LMI Institute Premium Organizational Members: $495
APDU, C2ER, LMI Institute Individual & Organizational Members: $575
Non-Members: $715

CANCELLATION POLICY: APDU must confirm cancellation before 5:00 PM (Eastern Standard Time) on August 14, 2020, after which a $135 cancellation fee will apply. Substitute registrations will be accepted.

APDU Member Blog Post: It’s not too late to rebuild data-user trust in Census 2020 data products

By: Jan Vink, Cornell Program on Applied Demographics
Vice chair of the Federal State Cooperative on Population Estimates Steering Committee
Twitter: @JanVink18
Opinions are my own

The Census Bureau is rethinking the way it will produce the data published from the 2020 Census. It argues that the old approach is no longer good enough because, with enough computing power, someone could learn too many details about the respondents.

There are two separate but related aspects to this rethinking:

  1. The table shells: what tabulations to publish and what not to publish
  2. Disclosure Avoidance Systems (DAS) that add noise to the data before creating these tables

Both aspects have huge consequences for data users. A good place to start reading about this rethinking is the 2020 Census Data Products pages at the Census Bureau.

The Census Bureau is aware that there will be this impact and has asked the data-user community for input in the decision process along the way. There were Federal Register Notices asking for use cases related to the 2010 tables and for feedback on a proposed set of tables. There were also publications applying a DAS to 1940 Census data, to PL94-171 data from the 2018 test, and to the 2010 Demonstration Products. Currently the Census Bureau is asking for feedback on the measurement of progress of the DAS implementation it plans to use for the first set of products coming out of the Census.

The intentions behind this stakeholder involvement were good, but it did not lead to buy-in from those stakeholders, and many are afraid that the quantity and quality of the published data will severely impact their capability to make sound decisions and do sound research based on Census 2020 and on products directly or indirectly derived from that data. Adding to this anxiety are the very difficult, unexpected circumstances the Census Bureau has to deal with while collecting the data.

From my perspective as one of those stakeholders wary about the quantity and quality of the data, there are a few things that could have gone better:

  • The need for rethinking has not been communicated clearly. For example, I cannot find a Census Bureau publication that plainly describes the re-identification process; all I can find are a few slides in a presentation. A layman's explanation of the legal underpinning would also be helpful, as some argue there has been a drastic reinterpretation.
  • The asks for feedback were all very complicated and time-consuming, and they reached only a small group of very dedicated data users who felt tasked to respond on behalf of many and stuck to the low-hanging fruit.
  • It is not clear what the Census Bureau did with the responses.
  • The quality of the 2010 Demonstration Products was very low and would have severely impacted my use of the data, and that of many others.
  • Most Census Bureau communications about this rethinking consisted of a mention of a trade-off between privacy and accuracy, followed by a slew of arguments about the importance of privacy and hardly any mention of how important accuracy is to the mission of the Census Bureau. Many stakeholders walked away with the feeling that the Bureau feels responsible for privacy protection, but not as much for accuracy.

There is a hard deadline for the production of the PL94-171 data, although Congress has the power to extend that date because of the Covid-19 pandemic. Working back from that deadline, I am afraid that decision time is not far away. The Census Bureau is developing the DAS using an agile process with about eight weeks between 'sprints'. The Bureau published updated metrics from sprint II at the end of May but had already started sprint IV at that time. If the eight-week cadence holds, my estimate is that there is room in the schedule for only two or three more sprints, and very little time to rebuild trust with the data-user community.

Examples of actions that would help rebuilding some trust are:

  • Appointing someone who is responsible for stakeholder interaction. So far, my impression is that there is no big-picture communication plan, and two-way communication depends too much on who you happen to know within the Census Bureau. Otherwise the communication is impersonal and slow, and often without a possibility for back-and-forth. This person should also have the seniority to fast-track the publication review process so stakeholders are not constantly two steps behind.
  • A plan B. A chart often presented to us is a line that shows the trade-off between privacy and accuracy. The exact location of that line depends on the privacy budget and the implementation of the DAS, and the Census Bureau seems to hold the position that it can implement a DAS with a sweet spot between accuracy and privacy that would be an acceptable compromise. But what if there is no differential-privacy-based DAS implementation (yet?) that can satisfy a minimal required accuracy and a maximal allowed disclosure risk simultaneously? So far it is an unproven technique for such a complex application. It would be good to hear that the Census Bureau has a plan B, and a set of criteria that would lead to a decision to go with plan B.
  • Promising another set of 2010 data similar to the 2010 Demonstration Products so data users can re-evaluate the implications of the DAS. This should be done in a time frame that allows for tweaks to the DAS. Results of these evaluations could be part of the decision whether to move to plan B.
  • Having a public quality assurance plan. The mission of the Census Bureau is to be the publisher of quality data, but I could not find anything on the Census Bureau website that indicates what is meant by data quality or what quality standards are used. Nor could I find who within the Census Bureau oversees and is responsible for data quality. For example: does the Bureau see accuracy and fitness for use as the same concept? Others disagree. And what about consistency? Can inconsistent census data still be of high quality? Being open about data quality and having a clear set of quality standards would help show that quality is as high a priority as privacy.
  • Publishing a timeline, with goals and decision points.
  • Giving feedback on the feedback: what did the Bureau do with the feedback it received? What criteria were used to implement some suggestions but not others?

Time is short and the stakes are high, but I think there are still openings to regain the trust of the data community and to produce Census data products that are of provably high quality while protecting the privacy of respondents at the same time.

APDU Launches 2020 Annual Conference

APDU is opening registration for the 2020 Annual Conference, set to be held at the Key Bridge Marriott in Arlington, VA on July 29-30, 2020. Trending issues in the world of data – issues of privacy, accuracy, and access – are profoundly changing how we think about the collection, production, sharing, and use of data. Register for the APDU Annual Conference today to learn how the coronavirus is impacting public data and evidence-based policymaking. Attendees will also hear about outcomes from the decennial census and the privacy and public health issues that are impacting it in 2020.

We recognize the tentative nature of in-person events in these uncertain times, but we will continue to plan for the conference with the hope of a return to normal. We are evaluating plans for a hybrid virtual conference to ensure that the conference will be delivered either live or online. Please know that cancellation fees will not apply to those who register early. If the conference cannot be held in person and transitions online, your registration will automatically transfer to the online content. If you don't wish to attend online, we will provide a full refund on request. We will monitor these issues closely and be responsive to our members and partners.

APDU Member Post: Assessing the Use of Differential Privacy for the 2020 Census: Summary of What We Learned from the CNSTAT Workshop

By:

Joseph Hotz, Duke University

Joseph Salvo, New York City Department of City Planning

Background

The mission of the Census Bureau is to provide data that can be used to draw a picture of the nation, from the smallest towns and villages to the neighborhoods of the largest cities. Advances in computer science, better record linkage technology, and the proliferation of large public data sets have increased the risk of disclosing information about individuals in the census.

To assess these threats, the Census Bureau conducted a simulated attack, reconstructing person-level records from published 2010 Census tabulations using its previous Disclosure Avoidance System (DAS), which was based in large part on swapping data records across households and localities. When combined with information in commercial and publicly available databases, these reconstructed data suggested that 18 percent of the U.S. population could be identified with a high level of certainty. The Census Bureau concluded that, if adopted for 2020, the 2010 confidentiality measures would lead to a high risk of disclosing individual responses, in violation of Title 13 of the U.S. Code, the law that prohibits such disclosures.

Thus, the Census Bureau was compelled to devise new methods to protect individual responses from disclosure. Nonetheless, such efforts – however well-intentioned – may pose a threat to the content, quality and usefulness of the very data that defines the Census Bureau’s mission and that demographers and statisticians rely on to draw a portrait of the nation’s communities.

The Census Bureau’s solution to protecting privacy is a new DAS based on a methodology referred to as Differential Privacy (DP). In brief, it functions by leveraging the same database reconstruction techniques that were used to diagnose the problem in the previous system: the 2020 DAS synthesizes a complete set of person- and household-level data records based on an extensive set of tabulations to which statistical noise has been added. Viewed as a continuum between total noise and total disclosure, the core of this method involves a determination of the amount of privacy loss, or ε, that can be accepted without compromising data privacy while ensuring the utility of the data. The key then becomes “where to set the dial”: set ε too low and privacy is ensured at the cost of utility; set ε too high and utility is ensured but privacy is compromised. In addition to the overall level of ε, its allocation over the content and detail of the census tabulations for 2020 is important. For example, specific block-level tabulations needed for redistricting may require a substantial allocation of the privacy-loss budget to achieve acceptable accuracy for this key use, but the cost is that the accuracy of other important data (including block-level data such as persons per household) will likely be compromised. Finding ways to resolve these difficult tradeoffs represents a serious challenge for the Census Bureau and users of its data.
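The accuracy-versus-privacy dial can be made concrete with the basic Laplace mechanism that underlies differential privacy. The sketch below is illustrative only: it shows a single noisy count release, not the Census Bureau's actual TopDown algorithm, which adds discrete noise to a whole hierarchy of tabulations and then post-processes the result.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two iid exponentials is Laplace-distributed.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon suffices.
    """
    return true_count + laplace_noise(1.0 / epsilon)

random.seed(0)
# Smaller epsilon -> more privacy, more noise, less accuracy.
for eps in (0.1, 1.0, 10.0):
    print(f"epsilon={eps}: released count ~ {dp_count(1000, eps):.1f}")
```

For a true count of 1,000, the release with ε = 10 will typically be off by a fraction of a person, while the release with ε = 0.1 can easily be off by dozens, which is the "where to set the dial" question in miniature.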

The CNSTAT Workshop

In order to test how well this methodology worked in terms of the accuracy of noise-infused data, the Census Bureau issued special 2010 Census files subject to the 2020 DAS. The demonstration files applied the 2020 Census DAS to the 2010 Census confidential data — that is, the unprotected data from the 2010 Census that are not publicly available. The demonstration data permit scientific inquiry into the impact of DP. In addition, the Census Bureau commissioned the Committee on National Statistics (CNSTAT) of the National Academies of Sciences, Engineering, and Medicine to host a two-day Workshop on 2020 Census Data Products: Data Needs and Privacy Considerations, held in Washington, DC, on December 11-12, 2019. The two-fold purpose of the workshop was:

  • To assess the utility of the tabulations in the 2010 Demonstration Product for specific use cases/real-life data applications.
  • To generate constructive feedback for the Census Bureau that will be useful in setting the ultimate privacy-loss budget and in allocating shares of that budget over the broad array of possible tables and geographic levels.

We both served as the co-chairs of the Committee that planned the Workshop. The Workshop brought together a diverse group of researchers who presented findings for a wide range of use cases that relied on data from past censuses.

These presentations, and the discussions surrounding them, provided a new set of evidence-based findings on the potential consequences of the Census Bureau’s new DAS. In what follows, we summarize “what we heard” or learned from the Workshop. This summary is ours alone; we do not speak for the Workshop’s Planning Committee, CNSTAT, or the Census Bureau. Nonetheless, we hope that the summary below provides the broader community of users of decennial census data with a better understanding of some of the potential consequences of the new DAS for the utility of the 2020 Census data products. Moreover, we hope it fosters an ongoing dialogue between the user community and the Census Bureau on ways to help ensure that data from the 2020 Census are of high quality, while still safeguarding the privacy and confidentiality of individual responses.

What We Heard

  • Population counts for some geographic units and demographic characteristics were not adversely affected by Differential Privacy (DP). Based on results presented at the Workshop, it appears that there were not, in general, differences in population counts between the 2010 demonstration file and the published 2010 data at some levels of geography. For the nation as a whole and for individual states, the Census Bureau’s algorithm ensured that counts were exact, i.e., counts at these levels were held invariant by design. Furthermore, the evidence presented also indicated that the counts in the demonstration products and those for actual 2010 data were not very different for geographic areas that received direct allocations of the privacy budget, including most counties, metro areas (aggregates of counties), and census tracts. Finally, for these geographic areas, the population counts by age in the demonstration products were fairly accurate when using broader age groupings (5-10 year groupings or broader), as well as for some demographic characteristics (e.g., for non-Hispanic whites, and sometimes for Hispanics).
  • Concerns with data for small geographic areas and units and certain population groups. At the same time, evidence presented at the Workshop indicated that most data for small geographic areas – especially census blocks – are not usable given the privacy-loss level used to produce the demonstration file. With some exceptions, applications demonstrated that the variability of small-area data (i.e., blocks, block groups, census tracts) compromised existing analyses. Many Workshop participants indicated that a larger privacy loss budget will be needed for the 2020 Census products to attain a minimum threshold of utility for small-area data. Alternatively, compromises in the content of the publicly-released products will be required to ensure greater accuracy for small areas.

The Census Bureau did not include a direct allocation of the privacy-loss budget in the 2010 demonstration file to all geographic areas, such as places and county subdivisions, or to detailed race groups, such as American Indians. As noted by numerous presenters, these units and groups are very important for many use cases, as they are the basis for political, legal, and administrative decision-making. Many of these cases involve small populations, and local officials rely on the census as a key benchmark; in many cases, it defines who they are.

  • Problems for temporal consistency of population counts. Several presentations highlighted the problem of temporal inconsistency of counts under DP, i.e., from one census to the next. The analyses presented at the Workshop suggested that comparisons of 2010 Census data under the old DAS to 2020 Census data under DP may well show inexplicable trends, up or down, for small geographic areas and population groups. (Comparisons of 2030 data under DP with 2020 data under DP may also show inconsistencies over time.) For example, when using counts as denominators to monitor disease or mortality rates at finer levels of geography, by race, by old vs. young, etc., the concern is that it will be difficult to distinguish real changes in population counts, and thus real trends in disease or mortality rates, from the impact of using DP.
  • Unexpected issues with the post-processing of the proposed DAS. The Top-Down Algorithm (TDA) employed by the Census Bureau in constructing the 2010 demonstration data produced histograms at different levels of geography that are, by design, unbiased, but they are not integers and include negative counts. The post-processing required to produce a microdata file capable of generating tabulations of persons and housing units with non-negative integer counts introduced biases that are responsible for many anomalies observed in the tabulations. These are both systematic and problematic for many use cases. Additional complications arise from the need to hold some data cells invariant to change (e.g., total population at the state level) and from the separate processing of person and housing unit tabulations.

The application of DP to raw census data (the Census Edited File [CEF]) produces estimates that can be used to model error, but the post-processing adds a layer of complexity that may be very difficult to model, making the creation of “confidence intervals” problematic.

  • Implications for other Census Bureau data products. Important parts of the planned 2020 Census data products cannot be handled by the current 2020 DAS and TDA approach. They will be handled using different but as-yet-unspecified methods that will need to be consistent with the global privacy-loss budget for the 2020 Census. These products were not included in the demonstration files and were out of scope for the Workshop. Nonetheless, as noted by several presenters and participants in the Workshop, these decisions raise important issues for many users and use cases going forward. To what extent will content for detailed race/Hispanic/nationality groups be available, especially for American Indian and Alaska Native populations? To what degree will data on household-person combinations and within-household composition be possible under DAS?

For example, while the Census Bureau has stated that 2025 will be the target date for the possible application of DP to the ACS, it indicated that the population estimates program will be subject to DP immediately following 2020. These estimates would then be used for weighting and post-stratification adjustments to the ACS.

  • Need for a plan to educate and provide guidance for users of the 2020 Census products. Regardless of what the Census Bureau decides with respect to ε and how it is allocated across tables, Workshop participants made clear that a major re-education plan for data users needs to be put in place, with a focus on how best to describe key data and the shortcomings imposed by privacy considerations and error in general. Furthermore, as many at the Workshop voiced, such plans must be in place when the 2020 Census products are released to minimize major disruptions to and problems with the myriad uses made of these data and the decisions based on them.
  • Challenging privacy concerns and their potential consequences for the success of the 2020 Census. Finally, the Workshop included a panel of experts on privacy. These experts highlighted the disclosure risks associated with advances in linking information in public data sources, like the decennial census, with commercial databases containing information on bankruptcies and credit card debt, driver’s licenses, and federal, state, and local government databases on criminal offenses, public housing, and even citizenship status. While there are federal and state laws in place to protect against the misuse of these governmental databases as well as the census (i.e., Title 13), their adequacy is challenged by advances in data linkage technologies and algorithms. And, as several panelists noted, these potential disclosure risks may well undercut the willingness of members of various groups – including immigrants (whether citizens or not), individuals violating public housing codes, or those at risk of domestic violence – to participate in the 2020 Census.
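The post-processing bias described in the findings above can be reproduced in miniature: forcing noisy counts to be non-negative integers truncates the negative tail of the noise, which pushes small counts upward on average. The simulation below is illustrative only; the actual TopDown post-processing solves a constrained optimization over a geographic hierarchy rather than clipping cell by cell.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two iid exponentials is Laplace-distributed.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

random.seed(0)
true_count = 2   # a small block-level count
scale = 2.0      # noise scale (inverse of this cell's privacy-budget share)
n = 100_000

raw = [true_count + laplace_noise(scale) for _ in range(n)]
clipped = [max(0, round(x)) for x in raw]  # non-negative integer counts

print(f"mean of raw noisy counts:     {sum(raw) / n:.3f}")      # ~2, unbiased
print(f"mean after clipping/rounding: {sum(clipped) / n:.3f}")  # biased upward
```

The raw noise averages out to the true count, but the clipped-and-rounded release is systematically too high for small counts, which is exactly the kind of anomaly Workshop presenters observed in block-level tabulations.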

The Census Bureau has recently stated that it plans to have CNSTAT organize a follow-up set of expert meetings to “document improvements and overcome remaining challenges in the 2020 DAS.” In our view, such efforts, however they are organized, need to ensure meaningful involvement and feedback from the user community. Many within that community remain skeptical of the Bureau’s adoption of Differential Privacy and its consequences for their use cases. So, not only is it important that Census try to address the various problems identified by Workshop presenters and others who evaluated the 2010 demonstration products, it also is essential that follow-up activities are designed to involve a broader base of user communities in a meaningful way.

We encourage members of the census data user community to become engaged in this evaluation process, agreeing, if asked, to become involved in these follow-up efforts. Such efforts will be essential to help ensure that the Census Bureau meets its dual mandate of being the nation’s leading provider of quality information about its people and economy while safeguarding the privacy of those who provide this information.

2020 APDU Conference Call for Proposals

#Trending in 2020: Data Privacy, Accuracy, and Access

APDU is welcoming proposals on any topic related to the privacy, accuracy, and access of public data.  Proposals can be for a single presentation or panel, whether based on a particular project, data practice, or formal paper.  In keeping with the theme of the conference, our interest is in highlighting the breadth of public data to both producers and consumers of public data.  Some examples of topics might cover:

  • Privacy
    • Differential privacy and tiered data
    • State/local data privacy issues
    • Data Suppression
    • Corporate data privacy (ex. Facebook’s use of differential privacy)
  • Accuracy
    • Machine learning and the use of programming languages
    • How data accuracy will affect redistricting or federal allocations
    • Federal agencies’ data protection actions and their impact on other agencies’ data
    • Synthetic or administrative data
    • Decennial Census
      • Citizenship question
      • Complete Count Committee
  •  Access
    • Future public data and policy developments
    • Current availability of public data (health, education, the economy, energy, the environment, climate, and other areas)
    • Federal statistical microdata such as ResearchDataGov
    • Federal Data Strategy updates and advocacy

Proposal Deadline: February 28, 2020.

You may submit ideas for a single presentation or a full panel (three presenters, plus a moderator). However, it is possible that we will accept portions of panel submissions to combine with other presenters. Submissions will be evaluated on the quality of work, relevance to APDU Conference attendees, uniqueness of topic and presenter, and thematic fit.

Please submit your proposal using the Survey Monkey collection window below.  Proposals will need to be submitted by members of APDU, and all presenters in a panel must register for the conference (full conference registration comes with a free APDU membership).  Proposers will be notified of our decision by March 13, 2020.

About APDU

The Association of Public Data Users (APDU) is a national network that links users, producers, and disseminators of government statistical data. APDU members share a vital concern about the collection, dissemination, preservation, and interpretation of public data.  The conference is in Arlington, VA on July 29-30, 2020, and brings together data users and data producers for conversations and presentations on a wide variety of data and statistical topics.

Create your own user feedback survey

The 2020 Census is Here and Businesses can Help

Companies make strategic decisions every day that rely on accurate data about customers, employees and markets. In the United States, the information gleaned from the decennial population census is an important ingredient in much of the data that companies use to make a range of decisions such as where to locate new stores/facilities, how to market products, and what services to offer customers. The federal government also uses census information to distribute more than $1.5 trillion for programs like roads, education and workforce development that help to strengthen the economy.

The next nationwide count starts in most of the country this March, and companies can help ensure its accuracy by encouraging employees and customers to participate.

Below are a series of resources from the US Census Bureau and ReadyNation that businesses and business membership organizations may find helpful when developing plans to support the count:

  • Newsletter language: Templates for (i) business organizations to engage their membership and (ii) companies to engage their employees.

FY20 Budget Moves from House to Senate

The House has passed appropriations bills to the Senate for FY2020, and there are important developments for statistical agencies. The Census Bureau, Bureau of Labor Statistics (BLS), and Bureau of Economic Analysis (BEA) each received modest to substantial increases in their budgets.

With massive increases in spending by the Census Bureau needed to successfully complete the Decennial Census, Congress appropriated $7.558B for the Census Bureau, with $274M for Current Surveys and Programs and $7.284B for Periodic Censuses and Programs. Importantly, this provides $6.696B for the Decennial Census, which is the minimum requested to complete the count effectively.

BEA received $107.9M, which assumes full funding for efforts to produce annual GDP for Puerto Rico. In addition, Congress apportioned $1.5M to the Outdoor Recreation Satellite Account, and $1M to develop income growth indicators.

After several years of flat funding, the BLS operational budget has been increased to $655M. This includes $587M for necessary expenses of the Bureau of Labor Statistics, including advances or reimbursements to state, federal, and local agencies and their employees for services rendered, of which no more than $68M may be expended from the Employment Security Administration account in the Unemployment Trust Fund. The total also includes $27M for the relocation of BLS headquarters to the Suitland Federal Center and $13M for investments in BLS such as an annual supplement to the Current Population Survey on contingent work, restoration of certain Local Area Unemployment Statistics data, and development of a National Longitudinal Survey of Youth.

Webinar Q & A: How Will New Census Bureau Privacy Measures Change 2020 Decennial Census Data?

On November 18, APDU hosted a webinar on new measures taken by the Census Bureau to protect respondent privacy in the decennial census known as “differential privacy.” The webinar recording is available to APDU members in the member email and in a follow-up email to webinar registrants. Below are follow-up answers to questions from the question and answer portion of the webinar.

Has there been any discussion concerning cell-specific error terms, akin to the ACS MOEs seen in the summary files?

DVR answer – We do not know whether error terms will be provided with the decennial census counts. Computing the error terms from the underlying data uses up part of the privacy loss budget, and the Census Bureau would have to decide whether that use of the budget would be worth doing. If part of the privacy loss budget is used to compute error terms, the actual counts will be more inaccurate. The Census Bureau recognizes the importance of such error terms; see https://arxiv.org/abs/1809.02201 for more details.

Do I understand correctly that saying a variable is invariant means it will be reported as tabulated, with no change?

That is correct. An “invariant” is a variable to which no noise will be added—it will be reported as enumerated (including any editing or imputation).

Is there any chance that the Bureau will realize that the cost/benefit of this is totally unacceptable? It seems like a massive over-reaction to me.

APDU has no formal position on this, but strongly encourages all data users to submit their feedback to the Census Bureau at dcmd.2010.demonstration.data.products@census.gov.

For more information about the comment process, see https://www.census.gov/programs-surveys/decennial-census/2020-census/planning-management/2020-census-data-products/2010-demonstration-data-products.html

Why is Illinois very high in the test tables? Is it because MCDs are a key part of the state’s political structure? Why wouldn’t New England states also be high in your tables, since MCDs are keys in those states?

Minor civil divisions are a fundamental part of Illinois' political structure, and many Illinois MCDs have small populations. I bet that MCDs in New England have larger populations, on average, than those in Illinois. Noise injection via differential privacy has a larger proportional impact on small populations. Thus, we see a larger fraction of Illinois MCDs with no vacant housing counts than we observe in the New England states.
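A quick back-of-the-envelope calculation shows why fixed-scale noise hits small geographies harder. The noise standard deviation below is illustrative only (roughly matching a two-sided geometric mechanism at epsilon = 1), not an official Census Bureau parameter, and the example populations are made up:

```python
# Illustrative only: assume integer noise with standard deviation ~1.4,
# roughly a two-sided geometric mechanism at epsilon = 1. The actual
# privacy-loss budget allocation is set by the Census Bureau.
NOISE_STD = 1.4

def relative_error(true_count, noise_std=NOISE_STD):
    """Typical noise magnitude as a fraction of the true count."""
    return noise_std / true_count

# The same absolute noise is a far larger share of a small MCD's count
# (hypothetical populations for a small vs. a large MCD).
small_mcd = relative_error(50)     # ~2.8% typical distortion
large_mcd = relative_error(5000)   # ~0.03% typical distortion
```

Because the noise scale does not depend on the size of the population, a count of 50 can easily be pushed to zero, while a count of 5,000 is barely disturbed in relative terms.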

Will smaller geographies sum to larger ones, such as blocks to blockgroups?

Yes, smaller geographies will sum to larger units, such as blocks to block groups. The final output of the differential privacy algorithm is a set of microdata with block IDs on them. Tabulations derived from these microdata will sum up the geography hierarchy.
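Because every tabulation is derived from the same set of noisy microdata, additivity up the geographic hierarchy is automatic. A toy illustration (the block IDs and counts are made up; the convention that the first digit of a four-digit block number identifies its block group mirrors standard Census block numbering):

```python
from collections import defaultdict

# Hypothetical noisy block-level counts, keyed by 4-digit block number.
# The first digit of each block number identifies its block group.
block_counts = {"1001": 12, "1002": 7, "1003": 0, "2001": 25, "2002": 3}

# Block-group totals are sums over the same microdata tabulation,
# so blocks add up to block groups by construction.
block_group_counts = defaultdict(int)
for block_id, count in block_counts.items():
    block_group_counts[block_id[0]] += count
```

No separate noise is added at the block-group level in this scheme; consistency comes from tabulating one shared microdata file rather than from reconciling independently noised tables.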

Why is a Laplace distribution used?

Technically, the Census Bureau is using a geometric distribution, which allows the process to draw integer values for noise introduction (and is similar to Laplace). Laplace is the current standard in differential privacy across the data privacy field. See the following links for a more detailed discussion of Laplace versus other symmetric distributions:

https://stats.stackexchange.com/questions/187410/what-is-the-purpose-of-using-a-laplacian-distribution-in-adding-noise-for-differ

https://www.johndcook.com/blog/2019/02/05/normal-approximation-to-laplace-distribution/

https://www.johndcook.com/blog/2017/09/20/adding-laplace-or-gaussian-noise-to-database/
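The two-sided geometric mechanism described above can be sketched in a few lines. This is a simplified illustration of the general technique, not the Bureau's actual implementation; the epsilon value and sampling details are assumptions:

```python
import math
import random

def two_sided_geometric(epsilon, rng=random):
    """Draw integer-valued noise from a two-sided geometric distribution.

    The difference of two i.i.d. geometric variables (failures before
    the first success, with success probability 1 - exp(-epsilon)) is
    two-sided geometric, the discrete analogue of Laplace noise. For a
    counting query with sensitivity 1, adding this noise satisfies
    epsilon-differential privacy.
    """
    p = 1.0 - math.exp(-epsilon)
    # Inverse-transform sampling of a geometric variable (>= 0).
    g1 = int(math.log(1.0 - rng.random()) / math.log(1.0 - p))
    g2 = int(math.log(1.0 - rng.random()) / math.log(1.0 - p))
    return g1 - g2

def noisy_count(true_count, epsilon, rng=random):
    """Return a differentially private version of an integer count."""
    return true_count + two_sided_geometric(epsilon, rng)
```

Unlike continuous Laplace noise, the result is always an integer, which is the appeal of the geometric variant for published population counts.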

Kathy’s slide 3 or 4 showed that one table that looks to be dropped for 2020 data products is HH by presence of nonrelatives. You also say Census may drop tables on young children at specific ages, such as 2 or 3. If these tables are dropped, research on the persistent and growing undercount of young children will be severely hampered. Households with nonrelatives are one of three types of complex households that have the highest correlation with young children who were originally missed in the 2010 Census, just some of whom were added back into the 2010 Census counts through the Census Followup Operation. The undercount of young children is a major issue that has been recognized by Congressional Committees, as well as the Census Bureau’s outside Advisory Committees, and Complete Count Committees. These are CRITICAL data for data users and policy makers. These tables are VERY much needed and we should urge the Census Bureau to provide these data!    

To be clear, no final decision on tables has been made yet. The Census Bureau is proposing that the DHC include a block-level table, "SEX BY AGE FOR THE POPULATION UNDER 20 YEARS [43]," so there will be a count of children by specific ages; I think that will meet your use case. One open question is geography: a table such as HOUSEHOLD TYPE BY RELATIONSHIP BY AGE FOR THE POPULATION UNDER 18 YEARS [36] (PCO9 in the new DHC) is proposed at the county level only, so it may not meet the needs of people who currently use it at the tract level.

Another issue is the importance of related children (which is not a category in the new tables) versus own children only. For example, related children are grouped with non-related children in the PCO9 table. This is not my area of study but may be of concern to some people.

In any case, we encourage you to dig into the tables yourself and share your perspective with the Bureau. Beyond whether the tables are published at all, whether the data are appropriate for your use case will also depend on how accurate the numbers are. If a particular table is of interest to your organization, we encourage you to take a look at the demonstration data and how it compares to the 2010 SF1 values.

For information on how to submit comments to the Census Bureau, see https://www.census.gov/programs-surveys/decennial-census/2020-census/planning-management/2020-census-data-products/2010-demonstration-data-products.html