Not sure where to put this one, but it involves perl scripts, so the more technical "system configuration" seems appropriate.
I like having plenty of information available on my desktop, especially about the machine itself. So I always use GKrellM if at all possible. One of the plugins that is nice to have, though not essential, is the weather info plugin. This plugin has had problems for years now because the original author disappeared and the NOAA weather site keeps shifting information around. When it first broke, I just ignored it for a while and then managed to figure out that the "program" that gathered the weather info for the plugin was just a script. A simple editing job fixed it after finding where the information had moved to.
But now that I am trying to get my Linux desktop up to date, I find that it has broken again. This time they have added https to the URL, and apparently plain http doesn't work anymore. There is a so-called replacement utility out there, but it doesn't gather all of the information. Instead of using that, I edited the script again. It might be working? I'm not sure, since I haven't had the chance to just let it sit and run for a few hours. When I manually run the script, it does retrieve the information.

So here's the question: how does Perl get the information? Is it invoking a browser in the background? Does it matter which browsers are present on my system? Or can Perl make the https connection with its own libraries regardless?

Here's the actual script, BTW. Maybe someone with actual knowledge of Perl can help; I have been blindly guessing how to change things. Notice the mention of LWP. This is apparently a Perl library that does HTTP connections? It is present in my Devuan install, so that is probably what the script is using to connect.
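As a point of reference, a minimal LWP fetch of a URL like the one in the script can be sketched as follows. It assumes the LWP::Protocol::https module (which uses IO::Socket::SSL under the hood) is installed, since plain LWP needs it for https:// URLs; no browser is involved at any point:

```perl
#!/usr/bin/perl
# Minimal HTTPS fetch with LWP; no external browser is involved.
# Needs LWP::Protocol::https installed for the https:// scheme.
use strict;
use warnings;
use LWP::UserAgent;

my $url = shift @ARGV
    or die "Usage: $0 <url>\n";
my $ua  = LWP::UserAgent->new(agent => 'Mozilla/5.0', timeout => 30);
my $rsp = $ua->get($url);
die 'fetch failed: ', $rsp->status_line, "\n" unless $rsp->is_success;
print $rsp->decoded_content;
```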
#!/usr/bin/perl
#
# Grabs the latest local weather conditions from the
# National Weather Service (NWS). Uses the decoded METAR reports.
#
# Need to give the 4-character METAR station code on the
# command line, e.g.:
#
# GrabWeather YSSY
#
$ReportDir = '.wmWeatherReports';
$WeatherSrc = 'https://tgftp.nws.noaa.gov/data/observations/metar/decoded';
use strict;
use vars qw( $ReportDir $WeatherSrc );
use IO::File;
#
# Change to users home directory. We used to dump into /tmp
# but using home dir instead avoids multiple users interfering
# with one another. (Yeah, we could "unique-ize" the filenames, but
# this is easier for now...)
#
my $home = $ENV{HOME} || (getpwuid($<))[7];
chdir() || chdir($home) or die "chdir '$home' failed: $!";
unless (-e $ReportDir) {
    mkdir $ReportDir, 0755 or die "unable to mkdir '$ReportDir': $!";
}
chdir $ReportDir or die "chdir '$ReportDir' failed: $!";
my $StationID = uc shift @ARGV or die "Usage: $0 <station-id>\n";
my $HTMLFileName = "$StationID.TXT";
my $URL = "$WeatherSrc/$HTMLFileName";
my $DataFileName = "$StationID.dat";
# Is LWP installed? If not, fall back to wget.
eval { require LWP::UserAgent };
if ($@) {
    my $cmd = qq{wget --no-cache --passive-ftp --tries=0 --quiet } .
              qq{--output-document=$home/$ReportDir/$HTMLFileName $URL};
    # system() returns the child's exit status; 0 means success.
    system($cmd) == 0 or die "unable to fetch weather: $?";
} else {
    $ENV{FTP_PASSIVE} = 1;    # LWP uses Net::FTP internally.
    # Note: fetching an https:// URL also requires LWP::Protocol::https.
    my $ua = LWP::UserAgent->new(agent => 'Mozilla/5.0', cookie_jar => {});
    $ua->env_proxy();
    my $req = HTTP::Request->new(GET => $URL);
    my $rsp = $ua->request($req);
    die $rsp->status_line unless $rsp->is_success;
    my $fh = IO::File->new("> $home/$ReportDir/$HTMLFileName")
        or die "unable to write '$home/$ReportDir/$HTMLFileName': $!";
    print $fh $rsp->content;
    close $fh or die "error closing '$home/$ReportDir/$HTMLFileName': $!";
}
#
# Parse HTML File.
#
my %stats = (
    temp           => -99.0,
    chill          => -99.0,
    dew_point      => -99.0,
    pressure       => -99.0,
    humidity       => -99.0,
    universal_time => '99:99',
);
my $fh = IO::File->new($HTMLFileName)
    or die "unable to read '$HTMLFileName': $!";
chomp($stats{station_info} = <$fh>);
chomp($stats{update_time}  = <$fh>);
while (<$fh>) {
    chomp;
    $stats{sky_conditions} = $1, next if /Sky conditions: (.*)/;
    $stats{temp}           = $1, next if /Temperature:\s*(-?[0-9.]+).*/;
    $stats{chill}          = $1, next if /Windchill:\s*(-?[0-9.]+).*/;
    $stats{dew_point}      = $1, next if /Dew Point:\s*(-?[0-9.]+).*/;
    $stats{pressure}       = $1, next if /Pressure \(.*\):\s*([0-9.]{2,}).*/;
    $stats{humidity}       = $1, next if /Relative Humidity:\s*(\d+)%.*/;
    $stats{coded_metar}    = $1, next if /ob: (.*)/;
}
close $fh or die "error closing '$HTMLFileName': $!";
#
# Isolate the Wind groups out of the coded METAR report.
# There may be two groups - the normal one and a variability set.
#
$stats{wind_group} = $stats{coded_metar};
$stats{wind_group} =~ s/ RMK\s.*$//;
$stats{var_flag} = 1 if $stats{wind_group} =~ /\d+(KT|MPS)\s\d+V\d+\s/;
if ($stats{wind_group} =~ /\s(\w{3})(?:(\d+)G)?(\d+)(KT|MPS)\s/) {
    @stats{qw( direction speed1 speed2 )} = ($1, $2, $3);
    if ($4 eq 'MPS') {
        # Convert metres/second to knots (1 m/s ~ 1.94 kt).
        $stats{speed1} *= 1.942 if defined $stats{speed1};
        $stats{speed2} *= 1.942;
    }
}
#
# Get the Time out of the coded Metar Report.
#
if ($stats{coded_metar} =~ /$StationID \d+?(\d{2})(\d{2})Z/) {
    $stats{universal_time} = "$1:$2";
}
#
# Write out the stuff we need to the Data File. This is the file that will
# be read by GKrellWeather.
#
my $fh = IO::File->new(">$DataFileName")
    or die "unable to write '$DataFileName': $!";
print $fh
    map { "$stats{$_}\n" }
    qw( station_info update_time sky_conditions universal_time
        temp dew_point chill pressure humidity );
#
# Wind direction: -99 = no data, 99 = variable (VRB),
# negated value = direction reported with a variability group.
#
if (not exists $stats{direction}) {
    print $fh "-99\n";
} elsif ($stats{direction} =~ /VRB/) {
    print $fh "99\n";
} elsif ($stats{var_flag}) {
    print $fh $stats{direction} * -1, "\n";
} else {
    print $fh $stats{direction} + 0, "\n";
}
#
# Wind speed: knots converted to mph; a negative value flags an
# averaged variable speed.
#
if (not exists $stats{direction}) {
    print $fh "-99\n";
} elsif (defined $stats{speed1} and defined $stats{speed2}) {
    my $ave_speed = (($stats{speed1} + $stats{speed2}) / 2.0) * 1.15155;
    print $fh "-$ave_speed\n";
} else {
    print $fh $stats{speed2} * 1.15155, "\n";
}
close $fh or die "error closing '$DataFileName': $!";
Offline
A quick look at the Perl shows no browser is involved: the script fetches the page with the LWP::UserAgent module if LWP is installed, and only falls back to shelling out to wget otherwise. This is fairly normal for scripts.
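Which fetch path the script takes on a given machine can be checked by mimicking its own probe (it tries to load LWP::UserAgent and uses wget only if that fails):

```perl
#!/usr/bin/perl
# Mirror of the script's "Is LWP installed?" probe: try to load
# LWP::UserAgent; on failure, the script shells out to wget instead.
use strict;
use warnings;

eval { require LWP::UserAgent };
print $@ ? "no LWP: script would fall back to wget\n"
         : "LWP present: script fetches with LWP::UserAgent\n";
```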
Geoff
Offline
Your modified script works fine here. Good job. You can change the update interval in gkrellm settings. Default is 15 min.
Offline
Well, I'm glad it works, but my concern is that I don't understand why it works. Guess I need to learn Perl. It's become very influential: sort of the scripting equivalent of the C dialects among full languages, so real techies need to know the basics even if they don't use it full time.
Offline
It works for you, but I don't see it working for me, which is strange. It does work when I run the script manually, but GKrellM doesn't seem to be able to run it. Does that indicate a permissions problem? I wonder what I should check.
Offline
Well, it was not a permissions problem. It was a derp problem. The package maintainer did not keep the program config and the installation in sync. Once discovered, it was easily fixed with a manual edit. The GrabWeather script is installed to /usr/share/gkrellm/GrabWeather but the config file tells the program to run it from /usr/local/share/gkrellm/GrabWeather -- resulting in a fail. Despite the bug being reported more than a year ago, it still has not been fixed in the repository. See: https://bugs.debian.org/cgi-bin/bugrepo … bug=895851
Offline
Oh! I did this in jessie, with the version before the one in the bug report. It looks for the script in the right place.
Beowulf has the same version as ascii, and it doesn't work there. I fixed it by making a symlink to the real GrabWeather.
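The symlink workaround described above can be sketched in a few lines of Perl. Hedged: it needs root, and the paths are the ones named in the bug report, so adjust them if your install differs:

```perl
#!/usr/bin/perl
# Sketch of the symlink workaround: point the path gkrellm-weather
# expects at the script's real install location. Run as root; the
# paths are the ones from the bug report above.
use strict;
use warnings;
use File::Path qw(make_path);

my $real     = '/usr/share/gkrellm/GrabWeather';
my $expected = '/usr/local/share/gkrellm/GrabWeather';
make_path('/usr/local/share/gkrellm');    # create parent dir if missing
symlink($real, $expected)
    or die "symlink '$expected' -> '$real' failed: $!";
```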
Offline