Having been fortunate enough to visit more than 30 national parks in the last few years, when I learned of the NPS API I thought it would make for an interesting data collection project for exploring AI code generation tools. The current version of my project can be found on GitHub: https://github.com/seanangio/nps-hikes
I started by sending the same initial prompt outlining a Python data pipeline to ChatGPT and Claude. While ChatGPT gave me one procedural script, Claude structured the code as a class. I started going back and forth with Claude in the browser, slowly copying and pasting, and asking for explanations of every detail.
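To give a flavor of that structure, here is a minimal sketch in the spirit of the class-based approach Claude proposed. The class and method names are my own invention for illustration, not the project's actual code; the `/parks` endpoint and its `parkCode` and `api_key` parameters are as documented in the NPS API.

```python
import requests


class NPSPipeline:
    """Minimal sketch of a class-based pipeline around the NPS API."""

    BASE_URL = "https://developer.nps.gov/api/v1"

    def __init__(self, api_key: str):
        self.api_key = api_key
        self.session = requests.Session()

    def fetch_park(self, park_code: str) -> dict:
        # The /parks endpoint accepts a parkCode filter and an api_key
        resp = self.session.get(
            f"{self.BASE_URL}/parks",
            params={"parkCode": park_code, "api_key": self.api_key},
        )
        resp.raise_for_status()
        return resp.json()["data"][0]

    def run(self, park_codes: list[str]) -> list[dict]:
        # Collect one record per park; real code would add retries,
        # logging, and persistence
        return [self.fetch_park(code) for code in park_codes]


if __name__ == "__main__":
    pipeline = NPSPipeline(api_key="YOUR_NPS_API_KEY")
    parks = pipeline.run(["yose", "zion"])
    print(parks[0]["fullName"])
```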
Soon I moved to Cursor, along with a short trial of the Claude CLI. As a VS Code user, I found both to be natural extensions of my workflow. With the AI able to see my actual files, I no longer needed to copy and paste back and forth, and I found myself granting more and more autonomy to the agent.
It certainly has its hallucinations, and I would be more cautious if I were working on something mission-critical, but for a personal project, it's remarkable what it can do to build something from scratch. For example, instead of researching the syntax for API endpoints myself, Cursor's agent was able to do this research for me, finding the right endpoint for accessing the USGS National Map data.
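I haven't re-verified exactly which endpoint the agent settled on, but the USGS Elevation Point Query Service (part of the National Map) is representative of the kind of thing it turned up. A hedged sketch of querying it:

```python
import requests

# USGS Elevation Point Query Service (part of the National Map).
# Assumption: this is representative of the endpoint the agent found,
# not necessarily the exact one used in the project.
EPQS_URL = "https://epqs.nationalmap.gov/v1/json"


def elevation_at(lon: float, lat: float) -> float:
    """Return the elevation in meters at a single lon/lat point."""
    resp = requests.get(
        EPQS_URL,
        params={"x": lon, "y": lat, "units": "Meters", "wkid": 4326},
    )
    resp.raise_for_status()
    return float(resp.json()["value"])


print(elevation_at(-119.5383, 37.8651))  # a point in Yosemite Valley
```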
With very little effort, I could set up a profiling module to create exploratory visualizations of each park, such as those that map the trails within a park’s boundaries:

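Such a map takes only a few lines of geopandas. A minimal sketch, where the file names are hypothetical stand-ins for the project's actual data:

```python
import geopandas as gpd
import matplotlib.pyplot as plt

# Hypothetical file names; any boundary polygon and trail lines layer works
boundary = gpd.read_file("park_boundary.geojson")
trails = gpd.read_file("trails.geojson").to_crs(boundary.crs)

# Keep only the trail segments that fall inside the park boundary
trails_in_park = gpd.clip(trails, boundary)

fig, ax = plt.subplots(figsize=(8, 8))
boundary.plot(ax=ax, facecolor="none", edgecolor="black")
trails_in_park.plot(ax=ax, color="tab:green", linewidth=1)
ax.set_axis_off()
fig.savefig("trail_map.png", dpi=150, bbox_inches="tight")
```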
Or those that depict the trail elevation profiles:

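The profiles follow the same pattern. A sketch, assuming hypothetical (distance, elevation) samples along a trail:

```python
import matplotlib.pyplot as plt

# Hypothetical samples along a trail: cumulative distance (km), elevation (m)
distances_km = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
elevations_m = [1200, 1260, 1340, 1390, 1370, 1450, 1510]

fig, ax = plt.subplots(figsize=(8, 3))
ax.plot(distances_km, elevations_m)
ax.fill_between(distances_km, elevations_m, min(elevations_m), alpha=0.2)
ax.set_xlabel("Distance along trail (km)")
ax.set_ylabel("Elevation (m)")
ax.set_title("Trail elevation profile")
fig.savefig("elevation_profile.png", dpi=150, bbox_inches="tight")
```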
To get even more out of these tools, I still have more to learn about context engineering. For a project as modest as mine, the entire codebase is indexed, so beyond keeping an eye on the context utilization of long chats, I rarely needed to worry about the exact context Cursor had for any given task. Defining project rules is another feature I haven't fully utilized.
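For reference, a Cursor project rules file is just plain-language guidance committed alongside the code. A hypothetical sketch of what one could contain:

```
# .cursorrules (hypothetical example)
- Keep data collection code in pipeline classes, not loose scripts.
- Use type hints and docstrings on every public function.
- Never commit API keys; read them from environment variables.
```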
One of the most interesting aspects of the project was integrating a Postgres database via an MCP server. With just a few prompts, the installation was complete, and I was getting real-time results from the database through natural language.
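For anyone curious what that setup looks like: Cursor reads MCP server definitions from a `.cursor/mcp.json` file. A sketch using the reference Postgres MCP server, where the connection string and database name are placeholders rather than my actual configuration:

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://localhost:5432/nps_hikes"
      ]
    }
  }
}
```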
