Install R Packages on Shiny Server Pro

If you’ve installed Shiny Server Pro as a user other than shiny, you might have experienced difficulty adding R packages. This is because Shiny Server Pro runs R as the shiny user, so running R -e "install.packages('foo')" as your own user installs packages into your personal library, where the shiny user can’t see them.

The solution to this is to su to the shiny user:

su - shiny

And run

R -e "install.packages('foo', repos='http://cran.rstudio.com')"

Alternatively, this script will parse an R file looking for require statements and install the necessary packages. It isn’t very smart, so be careful.
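
A minimal sketch of that kind of script, written here in Python, is below. The regular expression and the CRAN mirror are my own assumptions, and it naively treats every require()/library() argument as a CRAN package name, so review what it finds before letting it install anything (and run it as the shiny user).

#!/usr/bin/env python
# install_r_deps.py - naive sketch: scan an R file for require()/library()
# calls and install each referenced package via R. Package names are taken
# at face value, so false positives are possible.
import re
import subprocess
import sys

PKG_RE = re.compile(r"(?:require|library)\(['\"]?([A-Za-z0-9.]+)['\"]?\)")

def find_packages(path):
    with open(path) as f:
        return sorted(set(PKG_RE.findall(f.read())))

def install(pkg):
    cmd = "install.packages('%s', repos='http://cran.rstudio.com')" % pkg
    subprocess.check_call(["R", "-e", cmd])

if __name__ == "__main__":
    for pkg in find_packages(sys.argv[1]):
        print("Installing %s..." % pkg)
        install(pkg)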


What is Google foo.bar?

A week or two ago, the following popped up on my screen during a search for a Python-related topic:

You're speaking our language. Up for a challenge?

I had seen this once before, when our CTO got the same mysterious message a few months ago. We initially thought it was another one of Google’s Easter eggs, but a quick search revealed that everyone from HN and Reddit to Business Insider seems to think it’s a recruiting move by the search giant. (A similar program was rumored to be a search for cryptanalysts, but turned out to be related to The Imitation Game, so who knows?)

Update: it is a recruiting portal. Both of us were contacted by Google and interviewed on-site. The actual interview is under NDA, but I’ll post more about the interview process itself later.

The first time around, we discovered that replicating the query doesn’t necessarily trigger an invite, and visiting the URL without an invite doesn’t work. It was suggested that the invites are sent to a subset of users who have enabled search history. When I got the invite a week or two ago, I registered and then hit the “Back” button. The query string was preserved, so we tried an experiment: Is the invite based on a tagged query string, or the result of some back-end processing? I sent the URL to a couple of coworkers who had not received an invite after searching the same query, and they tried accessing it directly. We learned two things:

  1. Both of them subsequently received an invite.
  2. One of them hit “refresh” as the animation began to show the box, and no invite was shown upon refresh. Opening the link in an Incognito window gave him a second chance.

The most likely scenario is that certain queries redirect to the results page with a query string, which triggers the message. Since neither of the other developers writes much Python, yet both still got an invite after visiting the link, it’s likely that Google doesn’t validate invitee status. I doubt this is a simple oversight; more likely it indicates one of two things:

  1. Invitees are not on some sort of pre-selected list; and/or
  2. Google isn’t worried about additional invitees.

The latter was proven when the program displayed a “refer a friend” link. Assuming the recruitment theory is correct, it’s likely that Google is operating under the assumption that high-quality developers will refer other high-quality developers. I don’t know for sure, but this is probably a valid assumption.

To clarify some of the speculation, I was asked if I’d like a Google recruiter to contact me after completing the first six challenges.

Well, there goes that theory.

Others have asked Google directly about the program, and received a Python snippet that prints “glhf” in response – essentially “no comment”.

A Quick Tour

The pseudo-terminal responds to *nix commands like ls, cat, and less, and features its own editor. Listing the directory shows a text file:

Contents of start_here.txt

The help menu offers several possible commands:

help

The challenges are split into 5 levels, with several challenges at each level. Challenges fall into one of five categories, or tags.
Google Foobar Tags

Unfortunately, there has only been one crypto challenge available so far, and I haven’t been able to score a low_level challenge. Most of the challenges I’ve completed so far involve one-off applications of computer science problems – like whiteboard interview questions with a twist. Additionally, there are constraints on execution time and memory use, which prevent some naive implementations from passing the test cases. This speaks to the needs of a company like Google, which requires, or at least desires, efficient implementations rather than generic Algorithms 101 approaches.
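
As a purely made-up illustration (not one of the actual challenges), the gap between a naive and an efficient approach is exactly the kind of thing those limits seem designed to catch:

# Illustrative only, not a foobar problem: naive recursive Fibonacci blows
# past a time limit quickly, while the iterative version runs in linear time
# and constant space.
def fib_naive(n):
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)  # O(2^n)

def fib_fast(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a  # O(n) time, O(1) space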

I’ll be posting my solutions to GitHub shortly, along with some explanations here.


EnvironmentError: “mysql_config not found” While Installing MySQL-Python

While running “pip install mysql-python” on a fresh installation of Linux Mint 17, the following error occurred:

Traceback (most recent call last):
  File "", line 17, in 
  File "/tmp/pip_build_root/MySQL-python/setup.py", line 17, in 
    metadata, options = get_config()
  File "setup_posix.py", line 43, in get_config
    libs = mysql_config("libs_r")
  File "setup_posix.py", line 25, in mysql_config
    raise EnvironmentError("%s not found" % (mysql_config.path,))
EnvironmentError: mysql_config not found

Cause:

This problem is caused by the mysql_config binary not being on your PATH, most likely because it isn’t installed at all.

Solution:

Ensure that the libmysqlclient-dev package is installed:

sudo apt-get install libmysqlclient-dev -y

If you are still getting the error after this, check that the directory containing mysql_config is on your PATH:

echo $PATH

If that’s still not working, you can edit the “setup_posix.py” file and change the path attribute to match your local installation:

mysql_config.path = "/path/to/mysql_config"

(Note that the MySQL-Python bindings can also be installed system-wide with apt-get install python-mysqldb.)
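
Either way, a quick sanity check from the Python interpreter confirms the bindings import correctly:

# Verify the MySQLdb bindings are importable after installation.
import MySQLdb
print(MySQLdb.__version__)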


BlackBag Tool – A Framework for Rapid Information Discovery

Last Update: 14-Nov-2014

I’ve decided to pick the BlackBagTool project back up; it’s an attempt at a program/framework for finding interesting information on a mounted hard drive. The end goal is an application that gives an investigator a 2-minute summary of the information on the drive and acts as a springboard for the overall investigation. This post is an attempt at nailing down a spec.

Architecture

The layout consists of a series of Python modules and small scripts (installed to /usr/bin) that can be used in conjunction with each other. I’m debating whether or not to include an optional prefix on the command names for namespacing reasons.

The small, individual scripts can then be piped together or included in shell scripts to automate the discovery process. The Python modules can also be imported into scripts or used in the REPL.

I’m also aiming to build an application around this set of tools that fully automates the task of:

  1. Take the mount directory as an argument
  2. Determine the operating system (based on files/paths/etc)
  3. Gather relevant OS files (/etc/shadow, ~/.bash_history, recent documents, etc)*
  4. Determine what applications are installed, and possibly which versions
  5. Gather relevant application data (recent files, configuration/settings, history, cookies, etc)
  6. Parse data according to known formats and process fields against known patterns (dates, email addresses, etc)

Email address in <title> tag. Interesting email addresses can be found in browser history Title fields.
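
A rough sketch of steps 1 and 2, assuming OS detection is based on the presence of well-known paths under the mount point (the marker paths below are my own guesses, not a finalized list):

# Sketch of steps 1-2: guess the installed OS from marker paths under the
# mounted drive. The marker list is illustrative, not exhaustive.
import os
import sys

OS_MARKERS = {
    "Linux":   ["etc/shadow", "etc/fstab"],
    "Windows": ["Windows/System32", "Users"],
    "OS X":    ["System/Library/CoreServices", "Users"],
}

def detect_os(mount_dir):
    scores = {
        name: sum(os.path.exists(os.path.join(mount_dir, m)) for m in markers)
        for name, markers in OS_MARKERS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] else "Unknown"

if __name__ == "__main__":
    print(detect_os(sys.argv[1]))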

Components:

  • dbxplorer – A module for automatically gathering information about databases on a computer (db files, tables, raw data). Working on support for MySQL and SQLite now.
  • fsxplorer – A module for filesystem scanning.
  • bbtutils – A utility module for gathering information in a consistent way
  • skypedump – A utility for dumping Skype information (contacts, chat history, etc)
  • chromedump – A utility for dumping browser information from Google Chrome (history, downloads, favorites, cookies, autofill data, etc)