From “ModuleNotFoundError” to a polished CLI tool with color-coded output and JSON export
Introduction
When studying for the Google Cloud Associate Cloud Engineer certification, I realized that reading documentation only gets you so far. The real learning happens when you build something. So I decided to create a Python tool that would help me understand one of the exam’s core topics: the GCP resource hierarchy.
This post documents my journey building a Resource Hierarchy Explorer—a command-line tool that authenticates with Google Cloud, lists all accessible projects, checks enabled APIs, shows billing information, and exports everything to JSON.
Along the way, I hit real-world obstacles: missing dependencies, permission errors, and API access issues. Each problem taught me something valuable about how GCP actually works.
Exam Mapping: This project directly reinforces ACE Exam Section 1.1: "Setting up cloud projects, enabling APIs, resource hierarchy." The resourcemanager_v3 client taught me project enumeration, service_usage_v1 showed me API enablement patterns, and debugging permission errors gave me hands-on IAM experience.
Quick Start
If you just want to run the tool:
# Prerequisites (one-time setup)
gcloud auth application-default login
pip install google-cloud-resource-manager google-cloud-billing google-cloud-service-usage
# Run it
python gcp_resource_hierarchy_explorer.py
# Output: colored CLI display + gcp_hierarchy_YYYYMMDD_HHMMSS.json in the current directory

Required Permissions: Your account needs these permissions (typically granted via roles/viewer):

- resourcemanager.projects.get — list projects
- serviceusage.services.list — check enabled APIs
- billing.resourceAssociations.list — view billing linkage

If you see 403 errors, check your IAM roles in the Cloud Console.
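Before digging through the Console, you can also diff the permissions you actually hold (for example, the response of a testIamPermissions call on the project) against what the tool needs. A minimal sketch; REQUIRED_PERMISSIONS and missing_permissions are my own names, not SDK code:

```python
# Hypothetical helper: diff the permissions a principal actually holds
# (e.g., the response of a testIamPermissions call) against what this
# tool requires, so a 403 can be explained before any API call is made.
REQUIRED_PERMISSIONS = {
    "resourcemanager.projects.get",
    "serviceusage.services.list",
    "billing.resourceAssociations.list",
}

def missing_permissions(granted):
    """Return the required permissions absent from `granted`."""
    return REQUIRED_PERMISSIONS - set(granted)

print(missing_permissions(["resourcemanager.projects.get"]))
```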
Why Not Just Use gcloud?
A fair question: why write Python when gcloud projects list exists?
While gcloud is powerful, wrapping this in Python allows me to:
- Aggregate data (billing + APIs + project state) that would otherwise require three separate shell commands and complex piping
- Export everything to a clean JSON for auditing and documentation
- Add formatting and visual polish that makes the output actually readable
- Learn the SDK — which is exactly what I need for the certification
Google positions the client libraries as the right choice when you need to build automation or integrate GCP data into larger workflows—exactly this use case.
The Goal
Based on exam topic 1.1, I set out to build a tool with these features:
| Feature | Purpose | Exam Relevance |
|---|---|---|
| Authenticate using ADC | Learn how ADC works in practice | |
| List all accessible projects | Understand project enumeration | 1.1: Resource hierarchy |
| Check enabled APIs per project | See what services are active | 1.1: Enabling APIs |
| Show billing account linkage | Understand project-billing relationships | 1.1: Project setup |
Scope: Project-Level Focus. The final script focuses on project state, billing, and API enumeration. Full organization/folder tree traversal (using FoldersClient) is left as future work—my personal Gmail account has a flat structure anyway, so there was nothing to traverse.
Architecture Overview
Here’s the logical flow of the final script:
graph TD
    A[Start Script] --> B{ADC Auth Valid?}
    B -- No --> C[Error: Run gcloud auth]
    B -- Yes --> D[List Projects via Resource Manager API]
    D --> E[Loop Through Each Project]
    E --> F{Project State?}
    F -- ACTIVE --> G[Fetch Enabled APIs]
    G --> H[Fetch Billing Info]
    H --> I[Add to Results]
    F -- DELETE_REQUESTED --> J[Skip API/Billing Calls]
    J --> I
    I --> K{More Projects?}
    K -- Yes --> E
    K -- No --> L[Display Formatted Output]
    L --> M[Export to JSON]
    M --> N[Done]
Where Retries Would Go: For production use with large organizations, you'd add exponential backoff around steps G→H and H→I. See Google's retry guidance for the recommended pattern.
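As a sketch of that pattern (the function name and defaults are mine; google-api-core also ships a ready-made Retry helper you'd likely use in practice):

```python
import random

def backoff_delay(attempt, base=1.0, cap=60.0, jitter=False):
    """Seconds to sleep before retry `attempt` (0-indexed): the delay
    doubles each attempt and is capped; full jitter spreads load when
    many clients retry at once."""
    delay = min(cap, base * (2 ** attempt))
    return random.uniform(0, delay) if jitter else delay

# Without jitter, attempts 0..5 wait 1, 2, 4, 8, 16, 32 seconds.
```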
Setting Up the Environment
Prerequisites
Before writing any code, I needed to set up my local development environment:
# Initialize gcloud and authenticate
gcloud init
gcloud auth application-default login
# Create a Python virtual environment
python -m venv gcp-automation
source gcp-automation/bin/activate
# Install required packages
pip install google-cloud-resource-manager google-cloud-billing google-cloud-service-usage
# Freeze dependencies for reproducibility
pip freeze > requirements.txt

The gcloud auth application-default login command is crucial—it creates credentials that Python's Google Cloud SDK automatically discovers and uses. This is ADC in action.
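You can sanity-check ADC from Python before running the full tool. The search-order list below is my summary of the documented behavior, and verify_adc is a hypothetical helper (google.auth.default is the real call):

```python
# Where ADC looks for credentials, in order (summarizing Google's docs):
ADC_SEARCH_ORDER = [
    "GOOGLE_APPLICATION_CREDENTIALS env var pointing at a key file",
    "Credentials from `gcloud auth application-default login` (~/.config/gcloud/)",
    "The attached service account (GCE, Cloud Run, GKE, Cloud Functions)",
]

def verify_adc():
    """Raise DefaultCredentialsError if ADC is not set up; otherwise
    return the default project ID (may be None)."""
    import google.auth  # deferred so the constants above need no SDK
    credentials, project = google.auth.default()
    return project
```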
Reproducibility: I froze my dependencies with pip freeze > requirements.txt so anyone can recreate this environment with pip install -r requirements.txt.
Iteration 1: The Simplest Possible Script
I started with the most basic version—just list projects:
from google.cloud import resourcemanager_v3
def main():
projects_client = resourcemanager_v3.ProjectsClient()
print("Projects you can access:")
for project in projects_client.search_projects():
print(f" - {project.display_name} ({project.project_id})")
if __name__ == "__main__":
main()

Exam Relevance: ACE 1.1. search_projects() is the v3 API pattern for enumerating projects. This directly maps to understanding how the resource hierarchy exposes projects to authenticated principals.
First Bug: ModuleNotFoundError
When I ran it:
Traceback (most recent call last):
File "gcp_resource_hierarchy_explorer.py", line 1, in <module>
from google.cloud import resourcemanager_v3
ModuleNotFoundError: No module named 'google'
The problem: I wasn’t in my virtual environment. The packages were installed in gcp-automation, but I was running Python from the system installation.
The fix: Activate the virtual environment first:
source gcp-automation/bin/activate
python3 gcp_resource_hierarchy_explorer.py

Lesson Learned: Virtual Environments. Always check that your virtual environment is active. Look for (gcp-automation) in your terminal prompt before running scripts.
Success!
After activating the environment:
Projects you can access:
- garden-izzy-sh-ace (garden-izzy-sh-ace)
- izzys-garden-2026 (izzys-garden-2026)
- Izzy Garden (izzy-garden-production)
- gcp-izzys-garden-sh (gcp-izzys-garden-sh)
- garden-izzy-sh (garden-izzy-sh)
- izzys-garden (izzys-garden)
- izzy (gen-lang-client-0538135040)
- gcp-ace (gcp-ace-480302)
Eight projects! ADC was working, and the Resource Manager API was returning results.
Iteration 2: Adding Hierarchy Context
Next, I wanted to see where each project lived in the GCP Resource Hierarchy. Projects can belong to:
- An Organization (top level, requires Google Workspace or Cloud Identity domain)
- A Folder (for grouping projects within an org)
- Nothing (standalone projects, common with personal Gmail accounts)
I wrote code to parse the parent field and resolve folder names:
def get_parent_info(parent_string):
"""Parse parent string like 'folders/123' or 'organizations/456'"""
if not parent_string:
return None, None
parts = parent_string.split('/')
if len(parts) == 2:
return parts[0], parts[1]
return None, None

Output:
📋 garden-izzy-sh-ace
ID: garden-izzy-sh-ace
Parent: 📍 No parent (standalone)
All my projects showed as “standalone”—which makes sense for a personal Google account without a Workspace domain.
Lesson Learned: Account Types Matter. The hierarchy structure depends on your account type. Personal Gmail accounts typically have flat, standalone projects. Enterprise accounts with Google Workspace or Cloud Identity have Organizations → Folders → Projects. You need one of those identity providers to even have an organization.
Since my account has no org/folder structure, I simplified the final script to focus on project-level data. Full tree traversal with FoldersClient would be future work for enterprise use cases.
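For reference, here is roughly what that traversal could look like on an org-backed account. walk_folders and build_tree are my own hypothetical names; FoldersClient.list_folders is the real v3 call, and the recursion assumes resourcemanager.folders.list permission:

```python
def build_tree(nodes):
    """Group child resource names under their parent.
    `nodes` is an iterable of (name, parent) pairs, e.g.
    ("folders/222", "organizations/111")."""
    tree = {}
    for name, parent in nodes:
        tree.setdefault(parent, []).append(name)
    return tree

def walk_folders(parent):
    """Yield (folder_name, parent) for every folder under an org or
    folder, depth-first."""
    from google.cloud import resourcemanager_v3  # deferred import
    client = resourcemanager_v3.FoldersClient()
    for folder in client.list_folders(parent=parent):
        yield folder.name, parent
        yield from walk_folders(folder.name)
```

On my flat personal account, walk_folders("organizations/...") would have nothing to yield, which is exactly why I deferred this feature.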
Iteration 3: Checking Enabled APIs
Now for the most useful feature: seeing which APIs are enabled on each project.
from google.cloud import service_usage_v1
def get_enabled_apis(project_id):
"""Get list of enabled APIs for a project"""
client = service_usage_v1.ServiceUsageClient()
enabled_services = []
try:
request = service_usage_v1.ListServicesRequest(
parent=f"projects/{project_id}",
filter="state:ENABLED"
)
for service in client.list_services(request=request):
service_name = service.config.name
enabled_services.append(service_name)
except Exception as e:
# In production, log the exception type and HTTP status for debugging
print(f"Warning: Could not fetch APIs for {project_id}: {type(e).__name__}")
return None
return enabled_services

Exam Relevance: ACE 1.1. The filter="state:ENABLED" parameter is the documented pattern for listing only active APIs. This maps directly to understanding API enablement as a prerequisite for using GCP services.
Interesting Findings
The output revealed some useful information:
📋 gcp-izzys-garden-sh
ID: gcp-izzys-garden-sh
State: DELETE_REQUESTED
Enabled APIs:
⚠️ Error: 403 Project '99253127702' not found or permission denied.
Two of my projects were in DELETE_REQUESTED state, and I couldn’t query their APIs.
Lesson Learned: Project State Affects Access. Once a project is pending deletion, API access is revoked. The Resource Manager API still lists these projects, but you can't interact with their services. This is documented behavior—Resource Manager shows the project exists, but other APIs treat it as inaccessible.
Iteration 4: Adding Billing Information
The google-cloud-billing package lets you check which billing account is linked to each project:
from google.cloud import billing_v1
def get_billing_info(project_id):
"""Get billing account info for a project"""
client = billing_v1.CloudBillingClient()
try:
# The name format is documented in the API reference
name = f"projects/{project_id}"
billing_info = client.get_project_billing_info(name=name)
if billing_info.billing_enabled:
return {
"enabled": True,
"billing_account": billing_info.billing_account_name,
}
else:
return {"enabled": False, "billing_account": None}
except Exception as e:
# Log exception type for debugging; HTTP 403 usually means missing permission
print(f"Warning: Billing check failed for {project_id}: {type(e).__name__}")
return {"enabled": None, "error": str(e)[:50]}

Exam Relevance: ACE 1.1. Understanding billing-project linkage is part of project setup. A project without billing can't use paid services—this is a common gotcha in the real world.
Another Permission Error
Initially, all projects showed “Unable to access” for billing. The fix was enabling the Cloud Billing API:
gcloud services enable cloudbilling.googleapis.com --project=gcp-ace-480302

After that:
📋 garden-izzy-sh-ace
ID: garden-izzy-sh-ace
💳 Billing: 0190AD-CB8773-AA42D1
APIs: 23 enabled
Lesson Learned: APIs Must Be Enabled. Each Google Cloud API must be explicitly enabled before you can use it programmatically. This is a security feature—you opt in to what your project can access. Even "read-only" operations like checking billing status require the relevant API to be enabled.
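The flip side is that you can enable an API from code too, not just from gcloud. A hedged sketch: enable_api and service_resource_name are my wrappers, while EnableServiceRequest and the projects/{id}/services/{api} name format come from the Service Usage API reference:

```python
def service_resource_name(project_id, api):
    """Build the resource name that enable_service expects."""
    return f"projects/{project_id}/services/{api}"

def enable_api(project_id, api):
    """Enable one API (e.g. 'cloudbilling.googleapis.com') on a project.
    Requires the serviceusage.services.enable permission."""
    from google.cloud import service_usage_v1  # deferred import
    client = service_usage_v1.ServiceUsageClient()
    operation = client.enable_service(
        request=service_usage_v1.EnableServiceRequest(
            name=service_resource_name(project_id, api)
        )
    )
    return operation.result()  # blocks until the enable completes
```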
Iteration 5: JSON Export
For documentation and auditing, I added JSON export:
import json
from datetime import datetime
def export_to_json(projects_data, filename=None):
"""Export project data to JSON file"""
if filename is None:
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
filename = f"gcp_hierarchy_{timestamp}.json"
export_data = {
"exported_at": datetime.now().isoformat(),
"total_projects": len(projects_data),
"active_projects": len([p for p in projects_data if p["state"] == "ACTIVE"]),
"projects": projects_data
}
with open(filename, "w") as f:
json.dump(export_data, f, indent=2)
return filename

This creates timestamped exports like gcp_hierarchy_20260109_021729.json—useful for tracking changes over time or feeding into other automation.
Security: Sensitive Data in Exports. These JSON exports contain infrastructure details (project IDs, billing account IDs). Billing account IDs are considered sensitive in many organizations—they can reveal organizational structure and financial relationships. I added gcp_hierarchy_*.json to my .gitignore to prevent accidentally pushing internal data to GitHub.
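If you do need to share an export, masking the sensitive fields first is cheap. A small sketch (redact_export is a hypothetical helper keyed to the JSON shape this script produces):

```python
import copy

def redact_export(export_data):
    """Return a share-safe copy of an export_to_json payload with
    billing account IDs masked; the original dict is left untouched."""
    safe = copy.deepcopy(export_data)
    for project in safe.get("projects", []):
        billing = project.get("billing") or {}
        if billing.get("billing_account"):
            billing["billing_account"] = "REDACTED"
    return safe
```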
Iteration 6: Polish with Colors and Formatting
Finally, I added visual polish using ANSI escape codes:
class Colors:
HEADER = '\033[95m'
BLUE = '\033[94m'
CYAN = '\033[96m'
GREEN = '\033[92m'
YELLOW = '\033[93m'
RED = '\033[91m'
BOLD = '\033[1m'
DIM = '\033[2m'
RESET = '\033[0m'

With tree-style formatting using box-drawing characters, the output looks like:
╔══════════════════════════════════════════════════════════╗
║ ☁️ GCP Resource Hierarchy Explorer ☁️ ║
╚══════════════════════════════════════════════════════════╝
┌─────────────────────────────────────────────┐
│ 🟢 ACTIVE PROJECTS (6) │
└─────────────────────────────────────────────┘
📋 garden-izzy-sh-ace
├── ID: garden-izzy-sh-ace
├── 💳 Billing: 0190AD-CB8773-AA42D1
└── 🔌 APIs: 23 enabled
✓ analyticshub.googleapis.com
✓ bigquery.googleapis.com
...
I also added a progress indicator showing which project is being processed:
print(f"{c.DIM}Collecting project data...{c.RESET}")

For Production: Use the Rich Library. While I enjoyed manually handling ANSI codes (it's great for understanding how terminal formatting works!), for a production tool I would use the Rich library. It handles terminal formatting, tables, progress bars, syntax highlighting, and graceful fallback for terminals that don't support colors.
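For illustration, here is roughly what a Rich version could look like. project_row and display_with_rich are my own names; Table and Console are real Rich classes:

```python
def project_row(project):
    """Flatten one project dict (as built by collect_project_data)
    into a tuple of table cells."""
    billing = project.get("billing") or {}
    apis = project.get("apis") or []
    return (
        project["display_name"],
        project["project_id"],
        "yes" if billing.get("enabled") else "no",
        str(len(apis)),
    )

def display_with_rich(projects_data):
    """Render the projects as a Rich table; Rich handles color
    fallback on terminals that don't support it."""
    from rich.console import Console
    from rich.table import Table
    table = Table(title="GCP Projects")
    for column in ("Name", "ID", "Billing", "APIs"):
        table.add_column(column)
    for project in projects_data:
        table.add_row(*project_row(project))
    Console().print(table)
```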
The Final Script
The complete script is approximately 150 lines. Rather than inline everything here, I’ve made it available as a gist:
📁 gcp_resource_hierarchy_explorer.py (replace with your actual gist link)
Full Script
import json
from datetime import datetime
from google.cloud import resourcemanager_v3
from google.cloud import service_usage_v1
from google.cloud import billing_v1
class Colors:
"""ANSI color codes for terminal output"""
HEADER = '\033[95m'
BLUE = '\033[94m'
CYAN = '\033[96m'
GREEN = '\033[92m'
YELLOW = '\033[93m'
RED = '\033[91m'
BOLD = '\033[1m'
DIM = '\033[2m'
RESET = '\033[0m'
def get_enabled_apis(project_id):
"""Get list of enabled APIs for a project"""
client = service_usage_v1.ServiceUsageClient()
enabled_services = []
try:
request = service_usage_v1.ListServicesRequest(
parent=f"projects/{project_id}",
filter="state:ENABLED"
)
for service in client.list_services(request=request):
enabled_services.append(service.config.name)
except Exception as e:
print(f" Warning: API fetch failed for {project_id}: {type(e).__name__}")
return None
return enabled_services
def get_billing_info(project_id):
"""Get billing account info for a project"""
client = billing_v1.CloudBillingClient()
try:
name = f"projects/{project_id}"
billing_info = client.get_project_billing_info(name=name)
if billing_info.billing_enabled:
return {
"enabled": True,
"billing_account": billing_info.billing_account_name,
}
else:
return {"enabled": False, "billing_account": None}
except Exception as e:
print(f" Warning: Billing fetch failed for {project_id}: {type(e).__name__}")
return {"enabled": None, "error": str(e)[:50]}
def collect_project_data():
"""Collect all project data into a structured format"""
projects_client = resourcemanager_v3.ProjectsClient()
projects_data = []
for project in projects_client.search_projects():
project_info = {
"display_name": project.display_name,
"project_id": project.project_id,
"state": project.state.name,
"parent": project.parent if project.parent else None,
}
if project.state.name == "ACTIVE":
project_info["apis"] = get_enabled_apis(project.project_id)
project_info["billing"] = get_billing_info(project.project_id)
else:
project_info["apis"] = None
project_info["billing"] = None
projects_data.append(project_info)
return projects_data
def print_header():
"""Print a fancy header"""
c = Colors
print()
print(f"{c.CYAN}{c.BOLD}╔══════════════════════════════════════════════════════════╗{c.RESET}")
print(f"{c.CYAN}{c.BOLD}║{c.RESET} {c.BOLD}☁️ GCP Resource Hierarchy Explorer ☁️{c.RESET} {c.CYAN}{c.BOLD}║{c.RESET}")
print(f"{c.CYAN}{c.BOLD}╚══════════════════════════════════════════════════════════╝{c.RESET}")
print()
def print_section(title, color, count=None):
"""Print a section header"""
c = Colors
count_str = f" ({count})" if count is not None else ""
full_title = f"{title}{count_str}"
print(f"\n{color}{c.BOLD}┌{'─' * 45}┐{c.RESET}")
print(f"{color}{c.BOLD}│ {full_title:<43} │{c.RESET}")
print(f"{color}{c.BOLD}└{'─' * 45}┘{c.RESET}")
def display_results(projects_data):
"""Display results with colors and formatting"""
c = Colors
print_header()
active = [p for p in projects_data if p["state"] == "ACTIVE"]
inactive = [p for p in projects_data if p["state"] != "ACTIVE"]
print_section("🟢 ACTIVE PROJECTS", c.GREEN, len(active))
for project in active:
print(f"\n{c.BOLD}{c.BLUE} 📋 {project['display_name']}{c.RESET}")
print(f"{c.DIM} ├── ID:{c.RESET} {project['project_id']}")
billing = project.get("billing", {})
if billing and billing.get("enabled"):
account_id = billing["billing_account"].split("/")[-1]
print(f"{c.DIM} ├── 💳 Billing:{c.RESET} {c.GREEN}{account_id}{c.RESET}")
elif billing and billing.get("enabled") is False:
print(f"{c.DIM} ├── 💳 Billing:{c.RESET} {c.YELLOW}Not enabled{c.RESET}")
else:
print(f"{c.DIM} ├── 💳 Billing:{c.RESET} {c.RED}Unable to access{c.RESET}")
apis = project.get("apis")
if apis is None:
print(f"{c.DIM} └── 🔌 APIs:{c.RESET} {c.RED}⚠️ Unable to access{c.RESET}")
elif not apis:
print(f"{c.DIM} └── 🔌 APIs:{c.RESET} {c.YELLOW}(none enabled){c.RESET}")
else:
api_color = c.GREEN if len(apis) > 20 else c.CYAN if len(apis) > 5 else c.YELLOW
print(f"{c.DIM} └── 🔌 APIs:{c.RESET} {api_color}{len(apis)} enabled{c.RESET}")
for api in apis[:5]:
print(f"{c.DIM} ✓ {api}{c.RESET}")
if len(apis) > 5:
print(f"{c.DIM} ... and {len(apis) - 5} more{c.RESET}")
if inactive:
print_section("🔴 INACTIVE PROJECTS", c.RED, len(inactive))
for project in inactive:
print(f"{c.DIM} ⏳ {project['display_name']} ({project['project_id']}) - {c.YELLOW}{project['state']}{c.RESET}")
print(f"\n{c.CYAN}{c.BOLD}{'═' * 60}{c.RESET}")
def export_to_json(projects_data, filename=None):
"""Export project data to JSON file"""
if filename is None:
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
filename = f"gcp_hierarchy_{timestamp}.json"
export_data = {
"exported_at": datetime.now().isoformat(),
"total_projects": len(projects_data),
"active_projects": len([p for p in projects_data if p["state"] == "ACTIVE"]),
"projects": projects_data
}
with open(filename, "w") as f:
json.dump(export_data, f, indent=2)
return filename
def main():
c = Colors
print(f"{c.DIM}Collecting project data...{c.RESET}")
projects_data = collect_project_data()
display_results(projects_data)
filename = export_to_json(projects_data)
print(f"{c.GREEN}✅ Data exported to:{c.RESET} {c.BOLD}{filename}{c.RESET}")
print(f"{c.CYAN}{c.BOLD}{'═' * 60}{c.RESET}\n")
if __name__ == "__main__":
main()

Key Takeaways
Technical Lessons
- Virtual environments matter. Always activate your venv before running scripts.
- ADC simplifies authentication. One gcloud auth command enables all your scripts.
- APIs must be enabled explicitly. This is a security feature, not a bug.
- Project state affects access. DELETE_REQUESTED projects can't be queried for services.
- Permission errors are informative. A 403 tells you exactly what's missing—check IAM roles.
GCP Concepts Reinforced
| Concept | What I Learned | Exam Section |
|---|---|---|
| GCP Resource Hierarchy | Personal accounts have flat structures; orgs require Workspace/Cloud Identity | 1.1 |
| Application Default Credentials | The SDK automatically finds credentials from gcloud auth | 1.1 |
| Service Usage API | Lists enabled APIs; requires serviceusage.googleapis.com | 1.1 |
| Cloud Billing API | Shows billing account linkage; requires cloudbilling.googleapis.com | 1.1 |
| Project States | ACTIVE, DELETE_REQUESTED, DELETE_IN_PROGRESS affect what you can access | 1.1 |
These tables are essentially flashcards—I can lift them directly into my ACE Exam Notes for review.
Scaling Considerations
Rate Limits for Large Organizations: The Google Cloud client libraries handle pagination automatically for small datasets. However, for an organization with thousands of projects, you may hit API rate limits. In that case, you'd need to:
- Handle page tokens explicitly
- Add exponential backoff for retries (see Google’s retry guidance)
- Consider batching requests or running during off-peak hours
For my personal account with 8 projects, this wasn’t an issue—but it’s worth knowing for enterprise use cases.
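The explicit page-token loop has the same shape for every GCP list API. A generic sketch (paginate and fake_fetch are illustrative, not SDK code):

```python
def paginate(fetch):
    """Drain a page-token style API. `fetch(page_token)` must return
    (items, next_page_token); GCP list responses use an empty string
    as the token on the final page."""
    items, token = [], ""
    while True:
        page, token = fetch(token)
        items.extend(page)
        if not token:
            return items

# A stand-in fetch showing the contract; a real one would set
# request.page_token and call e.g. client.list_services(request=request),
# sleeping with backoff between pages when rate-limited.
def fake_fetch(token):
    pages = {"": (["a", "b"], "t1"), "t1": (["c"], "")}
    return pages[token]
```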
What’s Next?
This tool could be extended with:
- Full hierarchy traversal — Use FoldersClient to build an actual org tree for enterprise accounts
- IAM policy analysis — Show who has access to each project
- Cost estimation — Pull billing data for each project
- Resource inventory — List compute instances, storage buckets, etc.
- Drift detection — Compare current state to a baseline JSON export
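Drift detection in particular falls out almost for free from the JSON exports. A sketch (diff_exports is a hypothetical helper keyed to export_to_json's output shape):

```python
def diff_exports(baseline, current):
    """Report projects added, removed, or whose enabled-API count
    changed between two exports produced by export_to_json."""
    base = {p["project_id"]: p for p in baseline["projects"]}
    curr = {p["project_id"]: p for p in current["projects"]}
    added = sorted(curr.keys() - base.keys())
    removed = sorted(base.keys() - curr.keys())
    changed = sorted(
        pid for pid in base.keys() & curr.keys()
        if len(base[pid].get("apis") or []) != len(curr[pid].get("apis") or [])
    )
    return {"added": added, "removed": removed, "api_changes": changed}
```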
But for now, this covers the fundamentals of GCP exam topic 1.1. The best way to learn cloud infrastructure is to build tools that interact with it.
Resources
- Google Cloud Resource Manager API
- Application Default Credentials
- Cloud Billing API
- Service Usage API — List Services
- Rich Library for Python — Better terminal formatting
- Google’s Retry Guidance — For handling rate limits
Happy cloud exploring! ☁️
Related
- ACE Certification Plan — Full study plan
- gcp-overview — Why choose GCP
- gcp-learning-path — GCP learning roadmap
- gcp-resources — Courses and labs
- google-compute-engine — Deep dive into GCE
- GCE Mastery Roadmap — 20 hands-on projects
- Python for DevOps — Python automation basics