Our shul is in the process of switching from strictly Zoom-based services to in-person plus Zoom services (whoo!). To power this process, we're making use of ShulCloud's forms. Naturally, I wasn't excited about logging into ShulCloud's web interface to work with these forms, so I decided to add on to the command line tool I'd previously built.
The shulcloudassist tool, which you can find on GitHub, has the following new options:
$ shulcloudassist -a forms
...generate a command line friendly list of all forms...

$ shulcloudassist -a export-form -i 94848
...dump the submissions for form 94848 as CSV...
ShulCloud offers a form submission export URL that I was able to tap into without issue. Using this URL and the authentication functionality I'd previously built, it was trivial to pull the form submissions given a form ID.
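In shell terms, the export boils down to a single authenticated curl call. Here's a rough sketch; the endpoint path, the id parameter, and $AUTH_ARGS are my placeholders for illustration, not confirmed ShulCloud details:

# Sketch only: the endpoint path and 'id' parameter are assumptions,
# and $AUTH_ARGS stands in for whatever authentication flags apply.
FORM_ID=94848
curl -s -G $AUTH_ARGS \
  --data-urlencode "id=$FORM_ID" \
  "$BASE/admin/form_export.php" > "form_${FORM_ID}.csv"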
Getting the list of forms, however, proved to be trickier. There wasn't a CSV export option that I could find that gave me this info. It looked like my best bet was to scrape the HTML <table> that contained the list of forms.
To accomplish this, I used a new tool: pup. pup is inspired by jq, a tool that makes working with JSON a far simpler process. In fact, pup can output JSON, no doubt for further processing by jq.
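To see that JSON output in action, you can ask pup to emit nodes with json{} and pretty-print them with jq. A quick sketch, assuming the same table.listing markup used in the example below:

# Sketch: dump the third row of the listing table as JSON.
# The selector is an assumption about the page's structure.
cat $DOC_SNAPSHOT | \
  pup 'table.listing tr:nth-child(3) json{}' | \
  jq '.'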
It took a little experimentation, but ultimately, I got pup to do what I wanted. For example, to pull out the 3rd form's name, I run the following pup command:
curl -s -G ... $BASE/admin/forms.php > $DOC_SNAPSHOT
name=$(cat $DOC_SNAPSHOT | \
  pup "table.listing tr:nth-child(3) td:nth-child(1) text{}")
echo $name
By making creative use of nth-child(...) I can navigate the HTML document and get back just the data I need.
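For example, dropping the tr:nth-child(3) row filter returns every row, which is the basis for the forms listing. A sketch, again assuming the first column holds the form name:

# Sketch: grab the first cell of every row, one form name per line.
cat $DOC_SNAPSHOT | \
  pup "table.listing tr td:nth-child(1) text{}"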
While the process is messy, it does seem reliable. And any tool that brings data to the command line is a win in my book.