Replace a string with a new one in all files using sed and xargs
oldstring="some_string_to_search"
newstring="new_string_to_replace"
grep -rl "$oldstring" /path/dir/ | xargs sed -i "s@$oldstring@$newstring@g"
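As a quick check that the pipeline works, here is a runnable sketch; the /tmp/sed-demo directory and its contents are made up for illustration, and GNU sed is assumed for -i:

```shell
# Demo data (hypothetical): one file that matches, one that doesn't
mkdir -p /tmp/sed-demo
printf 'foo bar\n' > /tmp/sed-demo/a.txt
printf 'no match here\n' > /tmp/sed-demo/b.txt

oldstring="foo"
newstring="baz"
# grep -rl lists only the files that contain the string;
# xargs hands them to sed for the in-place substitution
grep -rl "$oldstring" /tmp/sed-demo/ | xargs sed -i "s@$oldstring@$newstring@g"
```

Only a.txt is rewritten; files without a match are left untouched, so sed never rewrites more files than necessary.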
Comment out a specific line using sed; this is handy for configuration files …
sed -i '123 s/^/#/' filename
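A minimal sketch on a throwaway file (the file name and contents are invented); the same address syntax works for any line number:

```shell
# Build a three-line demo "config" file
printf 'keep=1\ndisable=2\nkeep=3\n' > /tmp/conf-demo.conf

# Prepend '#' to line 2 only (GNU sed -i edits the file in place)
sed -i '2 s/^/#/' /tmp/conf-demo.conf

cat /tmp/conf-demo.conf
```

Lines 1 and 3 are untouched; only line 2 gains the leading #.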
This should be used for really huge database imports, like tens of GB … Note: remember to navigate to the directory where your .sql file lives so you can use source on it later …
cd path/to/dir/
mysql -u root -p
Note: if you’re not sure what these options do, please do some research first.
SET GLOBAL net_buffer_length = 1000000;      -- set the network buffer length to a large byte number
SET GLOBAL max_allowed_packet = 1000000000;  -- set the maximum allowed packet size to a large byte number
SET foreign_key_checks = 0;                  -- disable foreign key checks to avoid delays, errors and unwanted behaviour
-- import your SQL dump file (source reads the rest of the line, so keep it alone on its line)
source file.sql
SET foreign_key_checks = 1;                  -- remember to re-enable foreign key checks when the import is complete!
Searching inside a tarball (even a gzipped one) can be a real relief: you don’t have to extract all the files every time just to look for one with the find command.
Search tarball files using grep
tar -tvf my-data.tar.gz | grep search-pattern
Search tarball files using tar's built-in parameter
Note that if you want to use wildcards you need to add --wildcards […]
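A minimal sketch of the built-in matching, assuming GNU tar (the archive and file names below are made up):

```shell
# Build a small demo tarball
mkdir -p /tmp/tar-demo/etc
touch /tmp/tar-demo/etc/app.conf /tmp/tar-demo/etc/notes.txt
tar -czf /tmp/my-data.tar.gz -C /tmp tar-demo

# --wildcards lets the member-name argument act as a shell-style pattern,
# so only matching entries are listed
tar -tzf /tmp/my-data.tar.gz --wildcards '*.conf'
```

Unlike the grep variant above, this filters on the member name inside tar itself.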
Copy multiple files via ssh while preserving permissions and ownership. Creating a tarball on the fly, piping its output through ssh and extracting the files on the other end speeds up the upload, because the data streams continuously over the network and is transferred in one single connection. […]
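A sketch of the idea; the ssh hop is shown only as a comment because the user, host and paths are placeholders, while the local pipe below demonstrates the same pack-and-unpack stream:

```shell
# Over the network it would look like this (user, host and paths are placeholders):
#   tar -czpf - /path/to/files | ssh user@remote-host 'tar -xzpf - -C /destination/dir'
# -p preserves permissions; ownership is kept when the extracting side runs as root.

# Local demonstration of the same continuous stream:
mkdir -p /tmp/pipe-src /tmp/pipe-dest
echo 'hello' > /tmp/pipe-src/file.txt
tar -czpf - -C /tmp/pipe-src . | tar -xzpf - -C /tmp/pipe-dest
```

The first tar writes the archive to stdout (-f -), and the second reads it from stdin, so nothing ever touches the disk in between.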
Scan a web server’s directory structure recursively to find out if any symlinks exist. I used it to discover whether a shared web hosting server had been compromised/rooted using the symlink attack. Read about the attack in the link below. Note: don’t forget to change the path.
find /home*/*/public_html -type l -exec ls -l {} \; | grep "\-> /$"
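A runnable sketch (the directory layout is invented): plant a symlink that points at / and confirm the scan catches it:

```shell
# Fake a public_html directory containing a symlink to the filesystem root
mkdir -p /tmp/symlink-demo/public_html
ln -sf / /tmp/symlink-demo/public_html/root-link

# Same find/grep pair, pointed at the demo path; the grep keeps only
# symlinks whose ls -l output ends in "-> /"
find /tmp/symlink-demo/public_html -type l -exec ls -l {} \; | grep "\-> /$"
```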
Anatomy of the attack
Note that there will be a lot of false positives when scanning WordPress file structures.
grep '((eval.*(base64_decode|gzinflate))|\$[0O]{4,}|(\\x[0-9a-fA-F]{2}){8,}|cgitelnet|webadmin|PHPShell|tryag|r57shell|c99shell|noexecshell|revengans|myshellexec|FilesMan|JGF1dGhfc|document\.write\("\\u00|sh(3(ll|11)))' . -roE --include=*.php*
grep '((eval.*(base64_decode|gzinflate))|cgitelnet|webadmin|ircd|PHPShell|tryag|r57shell|c99shell|noexecshell|revengans|myshellexec|FilesMan|JGF1dGhfc)' . -roE --include=*.php*
Discovering world-writable files (any file whose others-write bit is set, such as mode 777) can sometimes help you find files you forgot to secure from your users’ eyes.
find / -perm -2 ! -type l -ls
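A small sketch with an invented demo path: one deliberately loose file, one locked down. -perm -2 matches any file whose others-write bit is set:

```shell
mkdir -p /tmp/perm-demo
touch /tmp/perm-demo/open.txt /tmp/perm-demo/safe.txt
chmod 666 /tmp/perm-demo/open.txt   # world-writable
chmod 640 /tmp/perm-demo/safe.txt   # not writable by others

# Only open.txt should be reported; ! -type l skips symlinks,
# whose permission bits are meaningless
find /tmp/perm-demo -perm -2 ! -type l -ls
```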
If you don’t understand Linux file permission bits, you should read this
Listing the top 20 processes by RAM usage on a Linux system is often useful when you want to narrow down which processes are eating up your server’s RAM.
ps aux | awk '{print $2, $4, $11}' | sort -k2nr | head -n 20
Find the ulimit -a values for other users that don’t have shell access, by using su to attach a bash shell and run the command as them.
su www-data --shell /bin/bash --command "ulimit -a"