Exploit-DB updates

Friday, May 20, 2011

Using pipelines to pipe data.

Well, this is a simple concept, but its usefulness has no limits: it's the ability to pipe the output of one command into another. For example, I can combine the "ip addr" command with grep to display only the lines containing inet addresses.

 ~ $ ip addr | grep inet
    inet 127.0.0.1/8 scope host lo
    inet6 ::1/128 scope host
    inet 10.0.0.3/24 brd 192.168.1.255 scope global eth0
    inet6 fe80::217:31ff:feda:9814/64 scope link
You could pipe that data along further to awk and select specific fields out of it to display. For example:

 ~ $ ip addr | grep inet | awk '{print $2 ":" $4}'
    127.0.0.1/8:host
    ::1/128:host
    192.168.1.102/24:192.168.1.255
    fe80::217:31ff:feda:9814/64:link
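The same idea extends as far as you want to chain it. Here's a quick sketch that feeds a captured sample of the "ip addr" output above through one more stage, keeping only the IPv4 lines and stripping the CIDR suffix to leave bare addresses (the sample lines are hard-coded with printf so you can try it anywhere):

```shell
# Pipe a sample of "ip addr" output through grep, awk, and cut:
# keep IPv4 lines only ("inet " with a trailing space excludes inet6),
# grab the second field, then cut off the /prefix part.
printf '%s\n' \
  "    inet 127.0.0.1/8 scope host lo" \
  "    inet6 ::1/128 scope host" \
  "    inet 10.0.0.3/24 brd 192.168.1.255 scope global eth0" \
  "    inet6 fe80::217:31ff:feda:9814/64 scope link" |
  grep "inet " | awk '{print $2}' | cut -d/ -f1
# Prints:
# 127.0.0.1
# 10.0.0.3
```

On a live box you'd just replace the printf block with "ip addr" itself.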



Another example involves that simple Java IP parser I made. While using it, I noticed that a lot of scans had duplicate IPs, which could cause you to waste valuable time running tools against the same IP multiple times. I haven't been playing much with Java lately, so I decided to just fix the problem in a bash script with the "uniq" command. This is the command the script uses to remove duplicate IPs from the IP list.

java -jar ips.jar nlog | sort | uniq > iplist.txt

That parses the nmap log for IPs, sorts them, and pipes them over to the uniq command, which removes the duplicates and writes the result to a file called iplist.txt, one IP per line. The sort matters: uniq only drops *adjacent* duplicate lines, so unsorted input can let repeats slip through.
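Worth knowing about uniq: it only collapses adjacent duplicate lines, so sorting the list first makes sure every repeat gets caught. A quick sketch with a few made-up IPs shows the difference:

```shell
# uniq alone misses the repeat because the duplicates aren't adjacent:
printf '10.0.0.1\n10.0.0.2\n10.0.0.1\n' | uniq
# Prints all three lines, 10.0.0.1 appears twice.

# Sorting first groups the duplicates together so uniq can drop them:
printf '10.0.0.1\n10.0.0.2\n10.0.0.1\n' | sort | uniq
# Prints:
# 10.0.0.1
# 10.0.0.2
```

"sort -u" does the same thing in one command if you prefer.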

You can find that Java parser here.
