Chris's Wiki :: blog/programming/UsePATH Commentshttps://utcc.utoronto.ca/~cks/space/blog/programming/UsePATH?atomcommentsDWiki2009-07-15T20:54:25ZRecent comments in Chris's Wiki :: blog/programming/UsePATH.By Chris Siebenmann on /blog/programming/UsePATHtag:CSpace:blog/programming/UsePATH:98ebe67ca1939e46dbcc02c9d9027fb94a858462Chris Siebenmann<div class="wikitext"><p>Yes, if you're worried about operating in a hostile environment you need
to control the paths of what you run (among a lot of other things). But you
can do that just as well by setting <code>$PATH</code> in your script as by using
absolute paths for everything.</p>
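<p>A minimal sketch of that approach (the directory list here is an assumption; adjust it for your systems):</p>

```shell
#!/bin/sh
# Instead of hard-coding /usr/sbin/lsof and friends, set a known-good
# $PATH once at the top of the script.
PATH=/usr/sbin:/usr/bin:/sbin:/bin
export PATH

# Every command after this point resolves against the controlled PATH,
# regardless of what the caller's environment looked like.
uname -s
```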
</div>2009-07-15T20:54:25ZFrom 212.24.143.70 on /blog/programming/UsePATHtag:CSpace:blog/programming/UsePATH:1545f7e60fac3e9c0d34ec697fc1425753a0e278From 212.24.143.70<div class="wikitext"><p>Not using absolute paths could be a security risk too: when someone misses configuring <code>sudo</code> correctly (its path, for example), a user who is allowed to run a script through <code>sudo</code> could inject their own commands.</p>
<p>I think every script should have a configuration section where commands are bound to variables, and the installation script for it (make, a packager, etc.) should be responsible for setting those variables to the correct commands. This could solve Linux and Solaris command differences too.</p>
<p>-jhr.</p>
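<p>A sketch of this idea, with assumptions throughout: a configuration section binds external commands to variables, and an install step (make, a packager) would rewrite those assignments per platform. Here <code>command -v</code> stands in for that install step.</p>

```shell
#!/bin/sh
# Configuration section: bind external commands to variables once.
# An installer would substitute platform-specific paths here.
SED=$(command -v sed) || exit 1
AWK=$(command -v awk) || exit 1

# The body of the script only ever uses the variables.
echo "linux,solaris" | "$SED" 's/,/ and /'
```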
</div>2009-07-15T07:24:25ZFrom 82.95.233.55 on /blog/programming/UsePATHtag:CSpace:blog/programming/UsePATH:2d3bd40a0f652083082862a37bf3cd693d70e948From 82.95.233.55<div class="wikitext"><p>If you're writing scripts (shell, perl, whatever) you should also be checking the return values of all the actions in the script and trapping errors as needed.</p>
<p>One must never trust anything. Hard disks fill, cd'ing to a different directory fails, files 'disappear'. I prefer writing Perl scripts because Perl makes it easy to check whether something worked, like <code>chdir("/path/to/dir") or die "cannot chdir /path/to/dir: $!";</code>, where <code>$!</code> holds the error that happened.</p>
<p>If things really are important, then you need to check that the script you wrote executed correctly, with Nagios for instance. If it didn't, you get an alert.</p>
<p>Writing scripts basically means being paranoid and assuming that anything can go wrong :-)</p>
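<p>A shell rendering of the same paranoia as the Perl <code>or die</code> pattern: check every step and fail with a message that says what actually went wrong.</p>

```shell
#!/bin/sh
# Check each action and stop immediately with a useful error message.
workdir=/tmp
cd "$workdir" || { echo "cannot cd to $workdir" >&2; exit 1; }
tmpfile=$(mktemp) || { echo "mktemp failed" >&2; exit 1; }
printf 'data\n' > "$tmpfile" || { echo "cannot write $tmpfile" >&2; exit 1; }
rm -f "$tmpfile" || { echo "cannot remove $tmpfile" >&2; exit 1; }
```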
</div>2009-07-14T19:34:39ZFrom 66.31.100.198 on /blog/programming/UsePATHtag:CSpace:blog/programming/UsePATH:6b4e99cb6eeb75279303e9d52de40dad21eb60cfFrom 66.31.100.198<div class="wikitext"><p>This is the stuff that scares me as a SysAdmin: the idea that someone's writing scripts that don't call specific binaries but rely on paths and what might be described as "chance".</p>
<p>I got burned once too many times and resorted to hard-wiring all binaries at the top of the script. I use simple variables, as in the previous comments' <code>$LSOF</code>, and reuse them throughout the script. This supports the "define once, use many" approach to programming and keeps me from screwing things up unnecessarily.</p>
<p>A friend suggested to me that if I did my job right, I should only have to define PATH within my script and that would assure all paths are correct. I duct-taped his mouse for this and glued down the phone. I have since considered it as an alternative approach, but I've not hammered on the idea enough to find out what the soft spots might be.</p>
<p>Until then, I will continue to use my "old school" approach of defining everything.</p>
</div>2009-07-13T19:33:33ZBy Dan.Astoorian on /blog/programming/UsePATHtag:CSpace:blog/programming/UsePATH:80e3384538f88cdd93c0598e974ad53cf37a3feeDan.Astoorian<div class="wikitext"><blockquote><p>If your script always ensures that important directories are in its <code>$PATH</code> (either by adding them at the front or end, or just by setting <code>$PATH</code> completely), you are fine.</p>
</blockquote>
<p>There are cases where this is problematic; e.g., a shell script which is a wrapper for a user application, and the user expects the application and its children to have the path that was there when it was fired up.</p>
<blockquote><p>If you have a Unix vendor that has both a <code>/usr/bin/PROG</code> <em>and</em> a <code>/usr/sbin/PROG</code> (or the like) and they're different, my opinion is that you have more problems than just broken scripts. Really, you need a new vendor, one that's sane.</p>
</blockquote>
<p>What about a vendor that has both a <code>/usr/bin/ps</code> and a <code>/usr/ucb/ps</code>?</p>
<p>Personally, I've had more scripts break because they got moved to a machine that had an incompatible version of a program (sometimes in /usr/local/bin/, perhaps because the native OS doesn't provide that program) than because the machine had a compatible version at a different path.</p>
<p>I've found that the case where the path is missing is usually a lot easier to troubleshoot than the case where the programmer (<em>i.e.</em>, I) assumed that <code>awk</code> is always <code>awk</code>. At least when I change the path to one that does exist and the script still doesn't work, I'm considerably less surprised.</p>
<p>I've also seen scripts work for me but break when my users ran them because the user had <code>$HOME/bin</code> at the beginning of <code>$PATH</code>, because they use a different version of some program that the script called; so if you're going to rely on things being in <code>$PATH</code>, you'd better be keeping a tight leash on what it contains.</p>
<p>--Dan</p>
</div>2009-07-13T18:34:11ZFrom 97.65.201.233 on /blog/programming/UsePATHtag:CSpace:blog/programming/UsePATH:7ae3bc6525e305a171c84c1d1c4ae82a990e87e8From 97.65.201.233<div class="wikitext"><p>If you upgrade your distro or are moving a script to a new box, it's fairly safe/sane to assume that scripts are going to break, whether because a file has moved to a new path location or otherwise. For pure sanity purposes I write with hard-coded paths. That way I know I'm using exactly the program and version I intended to use rather than whatever might be hidden away somewhere in a PATH statement. It's not been entirely unusual for me to log in to a box and find that it happens to have the same program installed twice, different versions, one in /bin and one in /usr/local/bin for example.</p>
<p>I don't see a solid argument against hard-coded paths here, to be honest, so I'll stick with hard-coding paths and just deal with the breakages when they occur.</p>
</div>2009-07-13T16:27:23ZBy Chris Siebenmann on /blog/programming/UsePATHtag:CSpace:blog/programming/UsePATH:2a80e3110029b105b5418cca6eeafec9f90a1521Chris Siebenmann<div class="wikitext"><blockquote><p>If you don't use absolute paths, you're at the mercy of your environment
variables being consistent across sessions.</p>
</blockquote>
<p>This is only true if your script does not set (or augment) <code>$PATH</code> itself.
If your script always ensures that important directories are in its
<code>$PATH</code> (either by adding them at the front or end, or just by setting
<code>$PATH</code> completely), you are fine.</p>
<blockquote><p>What if two same-name scripts are on the path? The undesired one may
get executed.</p>
</blockquote>
<p>If you have a Unix vendor that has both a <code>/usr/bin/PROG</code> and a
<code>/usr/sbin/PROG</code> (or the like) <em>and</em> they're different, my opinion is
that you have more problems than just broken scripts. Really, you need
a new vendor, one that's sane.</p>
<p>(This is where I admit that Fedora and RHEL are sort of this way, which
is why you want to have <code>/sbin</code> on the path before <code>/usr/bin</code>.)</p>
<p>As for checking whether or not lsof is present before trying to run
it: that doesn't make your script work, it just changes the error messages
that cron (or whatever) will email you.</p>
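<p>The "adding them at the front" option above can be sketched like this (the particular directories are an assumption for illustration):</p>

```shell
#!/bin/sh
# Augment rather than replace: prepend the directories the script
# needs to whatever $PATH it inherited.
PATH=/usr/sbin:/sbin:$PATH
export PATH
# /usr/sbin now wins command lookups, but the caller's own PATH
# additions still work for anything not found there.
```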
</div>2009-07-13T15:11:31ZFrom 64.193.21.117 on /blog/programming/UsePATHtag:CSpace:blog/programming/UsePATH:89966d6afb9e0a1200b9c03fca7a95a24405054cFrom 64.193.21.117<div class="wikitext"><p>Rather than abandoning explicit paths, would it not be better to use:</p>
<p><code>LSOF=/usr/sbin/lsof; if [ -e "$LSOF" ]; then 'run command'; else echo "script failed: lsof not in the right place"; fi</code></p>
<p>(sorry if my syntax is off, it's Monday and I've been gone a week)</p>
<p>-Rick Buford</p>
</div>2009-07-13T13:39:28ZFrom 173.70.22.84 on /blog/programming/UsePATHtag:CSpace:blog/programming/UsePATH:96ac9d8820d95dd5d74c5e819acfb9a0e6643ab4From 173.70.22.84<div class="wikitext"><p>I've really got to side with the explicit path declarations. Yes, in a heterogeneous environment the paths can change between machines, but if you're going to be running the same shell script in different environments, your scripts can be bullet-proofed: either by making executables into variables and defining their locations at the top of the script based on an OS check (recommended), or by maintaining a separate shell script for each OS build (not recommended).</p>
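<p>A sketch of the recommended pattern: one script, with per-OS binary locations chosen by a check at the top. The specific paths are illustrative assumptions, not a survey of real systems.</p>

```shell
#!/bin/sh
# Pick the binary location based on the OS, falling back to a PATH
# lookup on systems we have not catalogued.
case $(uname -s) in
    SunOS) LS=/usr/bin/ls ;;
    Linux) LS=/bin/ls ;;
    *)     LS=$(command -v ls) ;;
esac
"$LS" / >/dev/null
```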
<p>If you don't use absolute paths, you're at the mercy of your environment variables being consistent across sessions. It doesn't take very many "ssh foomachine /home/user/script.sh" errors or cron jobs (which don't always include the same PATH variable you know and love) failing before you start to explicitly state your paths. </p>
<p>Heck, I wrote a script this weekend where I got so sick of not having my environment variables in place that I just included /etc/profile. Oracle was involved, so hopefully I can be forgiven. </p>
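<p>The cron gotcha can be demonstrated with a stripped environment; this is a small sketch (the exact default PATH varies by cron implementation) showing how little a cron-style environment hands your script:</p>

```shell
#!/bin/sh
# Simulate a cron-like environment: wipe the environment with env -i
# and hand the child shell only a minimal PATH, the way cron does.
minimal_path=/usr/bin:/bin
out=$(env -i PATH="$minimal_path" sh -c 'echo "$PATH"')
echo "the script sees only: $out"
```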
<p>Matt Simmons<br>
<a href="http://www.standalone-sysadmin.com">http://www.standalone-sysadmin.com</a></p>
</div>2009-07-13T10:40:53ZFrom 75.152.155.177 on /blog/programming/UsePATHtag:CSpace:blog/programming/UsePATH:f47f989f33c969091ac0eaf1ae215f5a070c0587From 75.152.155.177<div class="wikitext"><p>What if two same-name scripts are on the path? The undesired one may get executed.</p>
</div>2009-07-13T06:17:40Z