My First Time


Creating a CLI Gem. Nope, not that kind of blog. Sorry.

Ok. That was fun, exciting and scary. Not scary in a ‘OMG I’m going to die!’ way, but more of a ‘I don’t know WTF I’m doing…wait YES I DO…no I don’t!’ kind of thing. I’m new to this whole coding thing, and every new step in the learning process is a brand new experience. Staying humble through the process and trusting that things will start clicking is a must. And they do start clicking. I should get the old clichéd adage, ‘Patience is a virtue’, tattooed on my fingers, so as I watch them type away, I can…well, you get the point. I’m getting faster at typing too, although I’ll probably need a new ‘delete’ key soon.

Anywho, let’s get down to business. So this was the first gem I created. It wasn’t as painful as I thought it would be. Quite easy in fact, as most of the hard work has been done by the brave souls that came before me – learning Ruby from a manual, trudging their desktops through the snow barefoot. You know the story. Thank Google for those guys and girls. Things are much easier now. A bundle here. A gem there. And bam. Your very own gem. All you have to do is fill out your files with some fancy classes, scrape a website (which I find disturbingly fun), test it (many times in my case), and voila – an honest to Google CLI that works.
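
For anyone wondering what ‘A bundle here’ actually means: these days a single Bundler command scaffolds the whole project (the gem name below is just what I used for mine):

    bundle gem surf_report

That spits out the gemspec, the lib/ directory, a Rakefile, and the rest of the skeleton, so you can skip straight to the fun part.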

[Screenshot: the CLI in action. It talks!]

I’ll back up a bit and talk about the whole process, or #!/usr/bin/env, if you will (a little Ruby humor…hehehe). I had a few ideas about what I wanted to do my project on. My final decision came down to the scraping part of the project. Oh yeah, I should tell you a little bit about the project. We were tasked with scraping a website for some info, with a second level of getting more detailed info about each option. I thought about the top 20 vegan (yeah, I’m one of those) restaurants in Los Angeles, but scraping Yelp looked like a no-no. There’s a site called LAist.com that has a top 20 things to do this weekend list, but that link is always changing and my scraping mastery is not yet what it hopefully will be. So what else is there? Well, I’m in Los Angeles. What do people do here? Buy stuff? Eh. Botox injections? Eh. Come on!

Ok. People like the sun and we have lots of waaaater. Surfing!!! I’m not a surfer (maybe some day) but I thought it would be a fun project. A little Bing-ing around (HA!) and I came across a site called surfline.com. Basic site with detailed forecasts of beaches around the world. Cool. Looked easy enough. Not so fast, grasshopper. The scraping part wasn’t as easy as I thought it’d be. All the details I wanted were buried in nested divs with no distinguishing classes to hook onto. Quite the challenge. I learned quite a bit. I’m sure there are faster ways, but I did it (cue Frank Sinatra) MY WAY! Yay for me! The real test would be if it worked the next day. And it did. Pretty cool! Look ma! No hands! Here’s a look at my Scraper class:

[Screenshot: the Scraper class]
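
Since the screenshot doesn’t paste in nicely here, a rough, simplified sketch of the idea, using Nokogiri and OpenURI, goes something like this (the URL and CSS selectors below are placeholders, not Surfline’s actual markup):

    # lib/surf_report/scraper.rb
    require 'nokogiri'
    require 'open-uri'

    module SurfReport
      class Scraper
        # Placeholder URL -- the real one points at a specific beach's report page
        URL = "http://www.surfline.com/surf-report/some-beach/"

        # Returns an array of hashes, one per day, e.g.
        # [{ day: "Friday", forecast: "3-4 ft, fair conditions" }, ...]
        def self.scrape_days
          doc = Nokogiri::HTML(open(URL))
          doc.css("div.forecast-day").map do |day_div|  # made-up selector
            {
              day:      day_div.css("h3").text.strip,
              forecast: day_div.css("p").text.strip
            }
          end
        end
      end
    end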


With the hard stuff out of the way, the rest was fairly easy. In fact, the program was almost finished by this point, because I set up my classes before I worked on the scraper; from there it was just connecting the dots. The run file lives in bin and has this line: SurfReport::CLI.new.call. That takes us to the call method in the CLI class.
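
A minimal version of that run file looks something like this (the file name and require path are my reconstruction, but the last line is straight from the gem):

    #!/usr/bin/env ruby
    # bin/surf-report (assumed file name)

    # Load the gem's environment: the Scraper, Report, and CLI classes
    require_relative "../lib/surf_report"

    SurfReport::CLI.new.call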

From there we call four methods (all four are sketched just after this list):

  1. Our ‘make_days’ method, which takes the array of hashes created in the Scraper class and passes it to the Report class, which instantiates a day object from each hash with the corresponding instance variables. Those objects are all saved in a class-level collection, exposed through the class method all, which leads us to…
  2. Our ‘list_surf_reports’ method, which iterates over Report.all and spits out a list of three days (today, tomorrow and the next), each with a brief forecast.
  3. Our ‘menu‘ method comes next, which is the mock meat and potatoes of the program. Here we ask the user which day they want to check out. We get(s) that input and throw her into a plain old if/else statement. You get the idea.
  4. Last but not least, our ‘later‘ method. Instead of typing the industry standard, ‘exit’, we type ‘later’, cuz we’re way cool surfers.
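
Here’s a stripped-down sketch of how those four methods hang together. The method bodies are simplified, and helper names like create_from_collection and scrape_days are my shorthand for this post, not gospel:

    module SurfReport
      class Report
        attr_accessor :day, :forecast

        @@all = []

        def initialize(attributes)
          attributes.each { |key, value| send("#{key}=", value) }
          @@all << self
        end

        # Turn the Scraper's array of hashes into Report objects
        def self.create_from_collection(days)
          days.each { |attrs| new(attrs) }
        end

        def self.all
          @@all
        end
      end

      class CLI
        def call
          make_days
          list_surf_reports
          menu
          later
        end

        # 1. Build a Report object for each day the Scraper found
        def make_days
          Report.create_from_collection(Scraper.scrape_days)
        end

        # 2. Print the three-day list with a brief forecast
        def list_surf_reports
          Report.all.each.with_index(1) do |report, i|
            puts "#{i}. #{report.day} -- #{report.forecast}"
          end
        end

        # 3. Ask which day to check out; loop until the user types 'later'
        def menu
          input = nil
          until input == "later"
            puts "Which day do you want to check out? (1-3, or 'later' to bail)"
            input = gets.strip.downcase
            if input.to_i.between?(1, Report.all.size)
              report = Report.all[input.to_i - 1]
              puts "#{report.day}: #{report.forecast}"
            end
          end
        end

        # 4. Our way-cool-surfer stand-in for 'exit'
        def later
          puts "Catch you on the next swell!"
        end
      end
    end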

And we’re like outta there, man! In the future, when my coding skills match my thinking, I’ll expand the scrape method to include searches for more cities and beaches. Check out my repo here if you have a minute. There’s a README file to get you started and surfing, and also a video of me explaining how to use the program. Thanks for reading about my first gem, and for that matter, my first blog post eva! Now get off my WAAAAAVE!!!!
