Posted by admin on July 20, 2008 under Tech Tips
If you are writing your latest and greatest Bash shell script that requires careful user input, then you are probably looking for a way to validate or sanitize the input before using the data in commands or subroutines. Here’s an example shell script that reads user input into a variable, sanitizes it into a new variable, and then uses the new variable to perform whatever function is required, in this case simply displaying the new value.
#!/bin/bash
read -p "Enter variable: " VAR_INPUT
# Sanitize input and assign to new variable
export VAR_CLEAN="`echo "${VAR_INPUT}" | tr -cd '[:alnum:] [:space:]'`"
echo "New Variable: ${VAR_CLEAN}"
Notice that we use the tr command to delete everything except alphanumeric and space characters. You can also perform further manipulation with any other command that comes to mind. For example, if you would like to also limit the input to 10 characters, use the cut command.
export VAR_CLEAN="`echo "${VAR_INPUT}" | tr -cd '[:alnum:] [:space:]' | cut -c -10`"
I like using tr in this fashion because, instead of trying to exclude specific bad characters, you can enforce a deny-all policy, making it easier to allow only what you want.
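For instance, if a prompt should accept nothing but digits, the same approach works with a tighter whitelist. Here is a quick sketch along those lines (the prompt and variable names are just for illustration):
#!/bin/bash
read -p "Enter VLAN number: " VLAN_INPUT
# Delete everything except digit characters
export VLAN_CLEAN="`echo "${VLAN_INPUT}" | tr -cd '[:digit:]'`"
echo "New Variable: ${VLAN_CLEAN}"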
As one of our readers mentioned, there is an even simpler method using only Bash search and replace! This eliminates the need to execute tr at all. In the following example, we sanitize the input, allowing only alphanumeric characters and spaces. I also show how to trim the string to a maximum length of 10 characters.
#!/bin/bash
read -p "Enter variable: " VAR_INPUT
# Sanitize input and assign to new variable
export VAR_CLEAN_1="${VAR_INPUT//[^a-zA-Z0-9 ]/}"
echo "New Variable 1: ${VAR_CLEAN_1}"
# Sanitize input, assign to new variable but limit it to 10 characters
export VAR_CLEAN_2="`echo "${VAR_INPUT//[^a-zA-Z0-9 ]/}" | cut -c -10`"
echo "New Variable 2: ${VAR_CLEAN_2}"
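If you want to avoid external commands entirely, Bash substring expansion can handle the length limit as well, making the pipe to cut optional. A minimal sketch building on the variables above (the VAR_CLEAN_3 name is just for illustration):
# Pure Bash alternative: keep only the first 10 characters using substring expansion
export VAR_CLEAN_3="${VAR_CLEAN_1:0:10}"
echo "New Variable 3: ${VAR_CLEAN_3}"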
For more information, be sure to check out the man pages for tr and take a look at the Advanced Bash-Scripting Guide. Additional comments and ideas welcome!
Posted by admin on March 6, 2008 under Tech Tips
If you are working on a project requiring that you generate hundreds or thousands of similar but unique text files, you will probably be looking for some type of task automation process. For example, a recent project required that we generate hundreds of Cisco switch configuration files, each individualized with IP addresses, OSPF configurations, random passwords, etc. We also needed to create an import data file for over a thousand DHCP ranges, obviously not something we felt like doing by hand.
For tasks like these, the best tools of the trade are text file templates that represent what the finished product should look like, spreadsheets for basic math and for creating import data in the form of CSVs (comma-separated values), and GNU/Linux shell scripting to pull it all together.
The process can be summarized as follows:
1. Use a template text file and identify the parts you wish to customize with easy-to-recognize variables.
2. Generate unique data in the CSV format that will serve to populate the template.
3. Run a Linux shell script that performs a search and replace function against the template, and outputs new text files with the values obtained from the CSV file.
In the following example, we will work with a basic Cisco switch configuration file that is condensed for brevity. Notice that we clearly label the text within the template that we will search for and replace with values obtained from the CSV data. These variables are VAR_HOSTNAME, VAR_IPADDR, VAR_NETMASK, VAR_OSPF_KEY, and VAR_TACACS_KEY.
Sample Template: import-template.txt
!
hostname VAR_HOSTNAME
!
interface Vlan100
ip address VAR_IPADDR VAR_NETMASK
ip ospf message-digest-key 1 md5 VAR_OSPF_KEY
!
tacacs-server host 1.1.1.1
tacacs-server key VAR_TACACS_KEY
!
The CSV file in our example is organized as follows:
Field 1=“Host name”, Field 2=“IP address”, Field 3=“Subnet Mask”, Field 4=“OSPF key”, and Field 5=“TACACS key”
Sample Data CSV File: import-data.txt
switch-01,10.1.1.1,255.255.255.0,ospf-key-1,tacacs-key-1
switch-02,10.1.1.2,255.255.255.0,ospf-key-2,tacacs-key-2
switch-03,10.1.1.3,255.255.255.0,ospf-key-3,tacacs-key-3
switch-04,10.1.1.4,255.255.255.0,ospf-key-4,tacacs-key-4
switch-05,10.1.1.5,255.255.255.0,ospf-key-5,tacacs-key-5
Here’s how we pull all of this together using a Linux shell script. First, we define the locations of our CSV import file and the template. Next, we run a for loop that reads the CSV line by line, assigning each data value to a variable name. Lastly, the script reads the template, replaces each defined search string with the corresponding value from the CSV, and sends the output to an individualized text file. Each text file is named after the unique value in the data, which in this case is the switch’s host name.
Import Shell Script: import-script.sh
#!/bin/bash
IMPORT="./data/import-data.txt"
TEMPLATE="./data/import-template.txt"
for i in `cat ${IMPORT}`
do
VAR_HOSTNAME=`echo $i | awk -F, '{print $1}'`
VAR_IPADDR=`echo $i | awk -F, '{print $2}'`
VAR_NETMASK=`echo $i | awk -F, '{print $3}'`
VAR_OSPF_KEY=`echo $i | awk -F, '{print $4}'`
VAR_TACACS_KEY=`echo $i | awk -F, '{print $5}'`
cat $TEMPLATE | sed -e s/VAR_HOSTNAME/$VAR_HOSTNAME/g \
-e s/VAR_IPADDR/$VAR_IPADDR/g \
-e s/VAR_NETMASK/$VAR_NETMASK/g \
-e s/VAR_OSPF_KEY/$VAR_OSPF_KEY/g \
-e s/VAR_TACACS_KEY/$VAR_TACACS_KEY/g \
| tee ./output/$VAR_HOSTNAME.txt 1>/dev/null
done
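Note that the for loop above splits the input on whitespace, so it would misbehave if a CSV field ever contained a space. If that is a concern, one possible variation (not part of the original script) is to read the file line by line with a while loop and let read split the fields on commas:
#!/bin/bash
IMPORT="./data/import-data.txt"
TEMPLATE="./data/import-template.txt"
# Read the CSV line by line, letting read split each record on commas
while IFS=, read -r VAR_HOSTNAME VAR_IPADDR VAR_NETMASK VAR_OSPF_KEY VAR_TACACS_KEY
do
  sed -e "s/VAR_HOSTNAME/$VAR_HOSTNAME/g" \
      -e "s/VAR_IPADDR/$VAR_IPADDR/g" \
      -e "s/VAR_NETMASK/$VAR_NETMASK/g" \
      -e "s/VAR_OSPF_KEY/$VAR_OSPF_KEY/g" \
      -e "s/VAR_TACACS_KEY/$VAR_TACACS_KEY/g" "$TEMPLATE" > "./output/$VAR_HOSTNAME.txt"
done < "$IMPORT"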
Step By Step:
To follow along, you can download the above-mentioned files, or copy and paste them from the text above. Place the import-data.txt and import-template.txt files in a new directory called “data”, create a directory called “output”, and make the import-script.sh file executable. The final step, of course, is to run the import script. Here is a step-by-step walkthrough to get you going.
wget https://savvyadmin.com/download/import-script/import-script.sh
wget https://savvyadmin.com/download/import-script/import-data.txt
wget https://savvyadmin.com/download/import-script/import-template.txt
mkdir data output
mv ./import-template.txt ./import-data.txt ./data/
chmod 755 ./import-script.sh
./import-script.sh
After you run the script successfully, list the “output” directory to see the newly created files.
ls -l output/
total 20
-rw-r--r-- 1 gmendoza gmendoza 184 2008-03-06 21:33 switch-01.txt
-rw-r--r-- 1 gmendoza gmendoza 184 2008-03-06 21:33 switch-02.txt
-rw-r--r-- 1 gmendoza gmendoza 184 2008-03-06 21:33 switch-03.txt
-rw-r--r-- 1 gmendoza gmendoza 184 2008-03-06 21:33 switch-04.txt
-rw-r--r-- 1 gmendoza gmendoza 184 2008-03-06 21:33 switch-05.txt
The contents of the first file in the example can be viewed as follows. Note that all the variables from the template have been replaced appropriately by the data from the CSV.
cat output/switch-01.txt
!
hostname switch-01
!
interface Vlan100
ip address 10.1.1.1 255.255.255.0
ip ospf message-digest-key 1 md5 ospf-key-1
!
tacacs-server host 1.1.1.1
tacacs-server key tacacs-key-1
!
I certainly hope this gives you some great ideas on how to make effective use of Linux scripts to turn large amounts of work into manageable tasks. Comments and questions are welcome.