Automation helps with boring and tedious work and saves time and energy (if you do it right, of course). Automation and task scheduling in Linux are handled by the cron daemon, and jobs are defined with the crontab command. The definition below describes the tool best:
cron is a Unix utility that allows tasks to be automatically run in the background at regular intervals by the cron daemon.
In this post you can find several examples of cron jobs and bash scripts that automate tasks like:
- Crontab for beginners
- Start random song from your Linux machine
- Start motivation/relaxation video at given hour
- Open random text file at Linux Mint start
- Scrape web data and load it into LibreOffice (or CSV, Excel, etc.)
- Generate simple report as doc file, create jupyter notebook and send mail
All examples are written and tested on Linux Mint 19 Cinnamon. I plan additional articles and videos for some of the topics above.
More articles on automation:
- Automate tasks in Windows and Linux
- Start and stop application from terminal in Linux Mint 19 / Ubuntu 18.04
Crontab for beginners
In order to schedule jobs for your user, you need to run:
crontab -e
The description of this command is:
Edit crontab file, or create one if it doesn’t already exist
Now you can list the scheduled jobs in Linux Mint / Ubuntu by:
crontab -l
This command will:
list the cron jobs, i.e. display the contents of the crontab file.
Example cron job:
0 16 * * * DISPLAY=:0 thunderbird
this will open the Thunderbird mail client every day at 16:00. The DISPLAY=:0 prefix is needed because cron jobs do not run inside your desktop session, so GUI programs must be told which X display to use.
More examples in tabular form:
minute | hour | day of month | month | day of week | command |
---|---|---|---|---|---|
0 | 16 | * | * | * | DISPLAY=:0 thunderbird |
0 | 9-15 | * | * | * | DISPLAY=:0 ~/motivation.sh |
0 | 16 | * | * | WED,SAT | DISPLAY=:0 ~/motivation.sh |
0 | 17 | 1,7 | * | * | DISPLAY=:0 ~/motivation.sh |
- run the command every day at 16:00
- run the command every hour from 9:00 to 15:00, every day
- run the command at 16:00 on Wednesday and Saturday only
- run the command at 17:00 on the 1st and 7th of the month
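For quick reference, the order of the five scheduling fields is shown below; the lines starting with # are ordinary crontab comments (cron ignores them), so you can keep such a block at the top of your own crontab as a reminder:
# field 1: minute        0-59
# field 2: hour          0-23
# field 3: day of month  1-31
# field 4: month         1-12 (or names like JAN)
# field 5: day of week   0-7 (0 and 7 are both Sunday; names like WED,SAT also work)
# field 6: the command to run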
Start random song from your machine
In order to be productive and stay healthy you need to take regular breaks. Working on a computer makes this habit difficult and you can easily lose track of time. To take regular breaks and get some motivation, a simple script can be used to play music every hour.
Below you can find a simple script which selects a random song from a folder, starts a player with that song and finally closes the player.
#!/bin/bash
MUSIC=/mnt/x/Music/Motivation/
cd "$MUSIC"
ls |sort -R |tail -1 |while read file; do
echo "$MUSIC$file"
rhythmbox "$MUSIC$file" & sleep 5m
done
killall rhythmbox
Short explanation of this script:
#!/bin/bash
- this is a convention so the Unix shell knows which interpreter to run
MUSIC=/mnt/x/Music/Motivation/
- this is the folder which contains the songs from which a random one will be chosen; a variable is defined with the folder and the current directory is changed to it
ls |sort -R |tail -1 |while read file; do
- select a random song and assign it to the variable file, which is used in the next statement
rhythmbox "$MUSIC$file" & sleep 5m
- start the music player with the selected song and keep it running for 5 minutes, the maximum length of a song in this folder
killall rhythmbox
- finally kill the music player
Once you have the script (for example, name it motivation.sh, place it in a folder like /home/user/cron and make it executable - see the snippet after these steps), you can easily schedule a new cron job:
- open new terminal with CTRL + ALT +T
- type:
crontab -e
- add at the end of the file:
0 9-17 * * * DISPLAY=:0 /home/user/cron/motivation.sh
if you want the songs to be played every hour from 9:00 to 17:00. Happy listening!
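Making the script executable and verifying the schedule is a one-time step. A minimal sketch, assuming the script is saved as /home/user/cron/motivation.sh (adjust the path to wherever you actually keep it):
# make the script executable so cron can run it
chmod +x /home/user/cron/motivation.sh
# run it once by hand to check that the player starts
/home/user/cron/motivation.sh
# confirm the new cron entry was saved
crontab -l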
Backed by science:
In 2005, research published in the journal Psychology of Music showed that software developers experienced more positive moods, better quality of work and improved efficiency when listening to music.
The Science Backed Ways Music Affects Your Brain and Productivity
Start motivation/relaxation video at given hour
The next example demonstrates how to take a relaxing pause at work with a few lines of code and the cron daemon. It helps me schedule a meditation video and relax for 10 minutes several times per day. Basically you can use any video or song that you like - it all depends on your personal preferences.
Usually I'm using meditation music or yoga videos shared by friends. If you don't have anything good in mind, you can download any video from YouTube and use it (just search for relaxing music or chill out).
The script is:
#!/bin/bash
vlc -f /home/user/Videos/Relaxing.mp4 & sleep 10m
killall vlc
- save the script as motivation.sh, for example in /home/user/cron/motivation.sh, or wherever you prefer
- start a Linux terminal
- type:
crontab -e
- add at the end of the file:
0 10 * * * DISPLAY=:0 /home/user/cron/motivation.sh
This will start the VLC player with the selected video at 10:00 in the morning, keep it running for 10 minutes and finally kill the player. So I get my break at 10:00 every day (add MON-FRI in the day-of-week field if you want it only on working days).
How to download youtube video in Ubuntu / Linux Mint:
- install youtube-dl by
sudo apt-get install youtube-dl
- download video by:
youtube-dl https://www.youtube.com/watch?v=mkKDI6y2kyE
Bonus: for homework you can try to find out how to download multiple videos and update the script to play a random one - a possible starting point is sketched below.
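A rough sketch of that idea, assuming the videos go into /home/user/Videos/Relax (the playlist URL and folder are placeholders - use whatever you like):
# youtube-dl also accepts playlist URLs and saves each video under its title
youtube-dl -o '/home/user/Videos/Relax/%(title)s.%(ext)s' "PLAYLIST_URL"
# reuse the random-selection trick from the music script
cd /home/user/Videos/Relax
ls | sort -R | tail -1 | while read file; do
    vlc -f "$file" & sleep 10m
done
killall vlc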
Backed by science:
- regular breaks help with:
- Increased productivity
- Improved mental well-being
- Creativity boost
- More time for healthy habits
Source: New Study Shows Correlation Between Employee Engagement And The Long-Lost Lunch Break
Open random text file at Linux Mint start
This example demonstrates how to automate a task at the start of your computer. For me it's very easy to like an article or a saying, but even easier to forget about it. So I have a script which loads a random piece of text when the machine starts. This way I can keep a few good reads from a book, an article or my to-do list :)
Below you can find the script for the automation:
#!/bin/bash
cd /home/user/Documents/Books/Motivation
ls |sort -R |tail -1 |while read file; do
xed "$file"
done
: '
This script opens a random text file
'
This script shows a similar technique to the first example. There are two new things in it:
xed
- the xed text editor is used to open the text file
- the : ' ... ' block at the end demonstrates how to use bash comments to keep meaningful information in your scripts
Finally, in order to schedule it for the start of your machine you can:
- open a terminal
- type:
crontab -e
and then:
@reboot DISPLAY=:0 /home/user/cron/openRandomText.sh
Or you can use the system startup applications (in Linux Mint):
- Main Menu
- Preferences
- Startup Applications
- Add ( the plus button)
- Custom command and type:
- Name
- Command
- Startup Delay
Now you will have all important notes before starting your working day (or any other day).
Backed by Science:
We remember things because they either stand out, they relate to and can easily be integrated in our existing knowledge base, or it’s something we retrieve, recount or use repeatedly over time
Source: How to improve your memory, according to neuroscience
Scrape web data and load it into LibreOffice (or CSV, Excel, etc.)
Web data is very important and valuable nowadays. Collection is done quickly, reliably and in an automated manner with special programs called spiders or scrapers. Writing a personal spider is very easy using modules like scrapy. I use spiders when I need to buy something and want to check the prices of the targeted product regularly - this way I can catch a really good promotion.
Below you can find a very simple spider which crawls a website and collects data:
import scrapy


class QuotesSpider(scrapy.Spider):
    name = "quotes"

    def start_requests(self):
        urls = [
            'http://quotes.toscrape.com/page/1/',
            'http://quotes.toscrape.com/page/2/',
        ]
        for url in urls:
            yield scrapy.Request(url=url, callback=self.parse)

    def parse(self, response):
        page = response.url.split("/")[-2]
        filename = 'quotes-%s.html' % page
        with open(filename, 'wb') as f:
            f.write(response.body)
        self.log('Saved file %s' % filename)
You can find more info at: Scrapy Tutorial
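Before automating it you can run the spider once by hand; assuming it was saved as quotes_spider.py (the file name is just an example), the command is:
scrapy runspider quotes_spider.py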
Once your spider is complete you can create a script to automate its execution. Below you can find an example script:
#!/bin/bash
cd /home/user/environments/venv36
source ./bin/activate
cd /home/user/PycharmProjects/python/src/spiders/
now=$(date +"_%Y-%m-%d")
scrapy runspider my_spider.py -o - >data/my_spider/my_spider$now.csv -t csv
python helpers/pandas_after.py
libreoffice --calc /home/user/data/my_spider$now.csv
Now let's talk a bit about the code above:
- the first two lines activate the dedicated Python environment for this spider
- then the working folder is changed to the spider project
- the line below runs the spider and saves the output as a CSV file
scrapy runspider my_spider.py -o - >data/my_spider/my_spider$now.csv -t csv
- the command
python helpers/pandas_after.py
calls a pandas helper which cleans the data, adds new columns and sorts it
- the final line opens the newly created CSV file in LibreOffice
This script is scheduled on my machine with the following crontab:
55 9 * * * DISPLAY=:0 /home/user/cron/spider_script.sh
P.S. If you accidentally delete your crontab jobs (for example with crontab -r), they can be restored (if they were running) by checking the logs:
grep 'CRON.*username' /var/log/syslog
where username is your user name.
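If you need to rebuild the file, the commands themselves (but not the schedules - those you have to infer from the log timestamps) can be pulled out with a sketch like this one:
# extract the unique commands cron has been running for your user
grep 'CRON.*(username)' /var/log/syslog | sed -n 's/.*CMD (\(.*\))$/\1/p' | sort -u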
Generate simple report as doc file, create jupyter notebook and send mail
The final example shows a small Python program which collects data from various sources like GitHub. The program below collects GitHub information about a given account:
# assumes the PyGithub package (pip install PyGithub); the import is aliased
# because the class below is also called Github
from github import Github as GithubApi

g = GithubApi()  # pass an access token here for a higher API rate limit


class Github:
    def getStars(self):
        for repo in g.get_user('Softhints').get_repos():
            print(repo)
            stars = repo.stargazers_count
            forks = repo.forks_count
            print(stars)
            print(forks)
            print(repo.get_topics())
            open_issues = repo.get_issues(state='open')
            for issue in open_issues:
                print(issue)
        return stars, forks
I have similar small classes for different APIs and sites like YouTube, Facebook, Twitter, etc. They collect information which is interesting to me on a weekly basis.
Another really useful tool is Selenium, which can help with web automation. Below you can find a basic automation for a Google search:
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

# path to the downloaded chromedriver binary
executable_path = r'/home/user/chromedriver_linux64/chromedriver'
options = Options()
# options.add_argument('--headless')
driver = webdriver.Chrome(executable_path=executable_path, chrome_options=options)
driver.get("https://google.com")
search = driver.find_element_by_name('q')
search.send_keys("softhints python automation")
print(driver.page_source.encode("utf-8"))
driver.quit()
For this code you need to:
- get the latest Chrome driver from: ChromeDriver - WebDriver for Chrome
- update the script with the path to the driver
- find the element on the web page - for this step you can use a really good browser extension: Katalon - Simplify API, Web, Mobile Automation Tests
- send the information for the search
- get the page source
The final data is collected into a CSV file stored on the computer, loaded into a Jupyter Notebook, and finally a small report is sent to me. If I find the report interesting I can go into the notebook for more details.
The sample code for the script is shown below:
#!/bin/bash
cd /home/user/environments/venv36
source ./bin/activate
cd /home/user/PycharmProjects/automation/src/
python weekly.py
python create_report.py
python create_notebook.py
python send_mail.py
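To run the whole pipeline automatically, the script can be scheduled like the earlier examples. Assuming it is saved as /home/user/cron/weekly.sh (the name and path are just examples), a crontab entry like this would run it every Monday at 8:00:
0 8 * * MON /home/user/cron/weekly.sh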
Different sections of the example code can be found below:
Python how to create doc files
from docx import Document
from docx.shared import Inches
document = Document()
document.add_heading('Document Title', 0)
p = document.add_paragraph('A plain paragraph having some ')
p.add_run('bold').bold = True
document.add_heading('Heading, level 1', level=1)
document.add_paragraph('Intense quote', style='Intense Quote')
document.add_paragraph(
    'first item in unordered list', style='List Bullet'
)
document.add_paragraph(
    'first item in ordered list', style='List Number'
)
document.save('demo.docx')
more info here:
Python library for creating and updating Microsoft Word (.docx) files.
Python how to create jupyter notebook
import nbformat as nbf
nb = nbf.v4.new_notebook()
text = """\
# Weekly report
Report for github, youtube, etc."""
code = """\
# Import pandas
import pandas as pd
# reading csv file
pd.read_csv("report.csv") """
nb['cells'] = [nbf.v4.new_markdown_cell(text),
               nbf.v4.new_code_cell(code)]
fname = 'report.ipynb'
with open(fname, 'w') as f:
    nbf.write(nb, f)
A notebook can be created in several different ways. I have two methods: the first one (shown above) uses nbformat, the second one uses JSON templates and creates notebooks with the IPython API.
Python how to send emails
# Import smtplib for the actual sending function
import smtplib
# Import the email modules we'll need
from email.message import EmailMessage
# Open the plain text file whose name is in textfile for reading.
with open(textfile) as fp:
    # Create a text/plain message
    msg = EmailMessage()
    msg.set_content(fp.read())
# me == the sender's email address
# you == the recipient's email address
msg['Subject'] = 'The contents of %s' % textfile
msg['From'] = me
msg['To'] = you
# Send the message via our own SMTP server.
s = smtplib.SMTP('localhost')
s.send_message(msg)
s.quit()
more info here: Python email: Examples
Conclusion: Automation can be your best friend or your worst nightmare - it all depends on you. Automation may increase dependency and complexity in general, but it can save time and energy.
My personal goal is to automate anything which can be automated and is going to be repetitive over time. Invest 1 day today and save 1 month in the future. This rule has helped me do more work in less time. Of course, there is a risk of being late with a given task, but later this investment pays off.