Twitter recording status

It's fairly simple to set up a user job to send out recording status updates via Twitter. Paste the following code into a file (for example, twitter.pl):

#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# The status text is passed in as the first command-line argument.
my $output = shift @ARGV;

my $browser = LWP::UserAgent->new;
my $url = 'http://twitter.com/statuses/update.json';

# Basic-auth credentials for the Twitter API (replace with your own).
$browser->credentials('twitter.com:80', 'Twitter API', 'username', 'password');

# Check the credentials, then post the status update.
$browser->get("http://twitter.com/account/verify_credentials.json");
my $response = $browser->post($url, {status => $output});

Edit the username and password to match your Twitter username and password. Save the file, make it executable, and put it somewhere in your path. In this example we'll put the file in /usr/bin.

chmod +x twitter.pl
cp twitter.pl /usr/bin/
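
You can test the script from a shell before wiring it into MythTV; the message text below is just an illustration:

/usr/bin/twitter.pl "Testing MythTV recording status updates"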

Stop your backend and run mythtv-setup. Two settings need to be changed. First, in step 1 (General), allow the new user job to be run on this backend; for example, if your new User Job is the first one, tick "Allow User Job #1 on this backend". Second, on the User Job setup page, give your job a name, such as "Post-record Twitter". Then you can use something like the following command line:

/usr/bin/twitter.pl "Finished recording %TITLE% (%SUBTITLE%) on %CHANID% at %ENDTIMEISO%.  Backend was %HOSTNAME%."
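
For example, with hypothetical values filled in for the variables, the status posted by this job would read something like:

Finished recording The News (Evening Edition) on 1021 at 2010-06-06T18:30:00.  Backend was mybackend.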

You can insert information as you wish, using any of the variables available to User Jobs. Complete the setup and you will have a user job which can be run at the end of individual or all recording rules: edit your recording rules and set the user job to run at the end of each recording to update your Twitter status.

NOTE: Some people regard Twitter as a handy news resource and search on tags for things they want to keep up with - say #MythTV, for example. Inserting #MythTV in your user job twitter script makes all your recording tweets appear in that search feed, meaning that people who genuinely want to follow real news items have to add the username your script uses to their filter. As of 11th November 2009 the filter string is quite short, but as time goes by it will get longer. PLEASE choose a different tag, or no tag at all. The majority of people wanting to follow #MythTV items are probably not interested in what you record or watch ;-)

A more complete Twitter user job script is possible which also tweets the actual channel name. Save the following script as twitter.pl, mark it as executable as before, and set the user job to run as:
 twitter.pl starttime=%STARTTIME% chanid=%CHANID%


#!/usr/bin/perl
use LWP::UserAgent;
use DBI;
use DBD::mysql;
use MythTV;

$connect = undef;
$debug = 0;
$title="";
$subtitle="";
$newsubtitle="";
$endtime="";
$starttime="";
$chanid="";

##################################
#                                #
#    Main code starts here !!    #
#                                #
##################################

$usage = "\nHow to use twitter.pl:\n  twitter.pl starttime=%STARTTIME% chanid=%CHANID% debug\n"
        ."\n%CHANID% = channel ID associated with the recording\n"
        ."%STARTTIME% = recording start time in either 'yyyy-mm-dd hh:mm:ss' or 'yyyymmddhhmmss' format\n"
        ."debug = enable debugging information - prints what would be tweeted instead of posting it\n";

# get this script's ARGS
#

$num = $#ARGV + 1;

# if the user hasn't passed both required arguments, die and print the usage info

if ($num < 2) {
        die "$usage";
}

#
# Get all the arguments
#

foreach (@ARGV){
    if ($_ =~ m/debug/) {
        $debug = 1;
    }
    elsif ($_ =~ m/starttime/) {
        $starttime = (split(/\=/,$_))[1];
    }
    elsif ($_ =~ m/chanid/) {
        $chanid = (split(/\=/,$_))[1];
    }
}

# connect to backend
my $myth = new MythTV();
# connect to database
$connect = $myth->{'dbh'};

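# Look up the channel name for this recording's channel ID.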
$query = "SELECT name FROM channel WHERE chanid=$chanid";
$query_handle = $connect->prepare($query);
$query_handle->execute()  || die "Unable to query channel table";

my ($channame) = $query_handle->fetchrow_array;

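# Fetch the title, subtitle and end time of the recording that has just finished.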
$query = "SELECT title, subtitle, endtime FROM recorded WHERE chanid=$chanid and starttime='$starttime'";
$query_handle = $connect->prepare($query);
$query_handle->execute()  || die "Unable to query settings table";

$query_handle->bind_columns(undef, \$title, \$subtitle, \$endtime);
$query_handle->fetch();

if ($subtitle)
{
    $newsubtitle = " - ".$subtitle;
}

$output = "Finished recording $title $newsubtitle from $channame at $endtime";
    print "Chanid $chanid \n";
    print "Starttime $starttime \n";
    print "$output \n";

# In debug mode just print the details (above); otherwise post the status to Twitter.
unless ($debug)
{
    my $browser = LWP::UserAgent->new;
    my $url = 'http://twitter.com/statuses/update.json';

    # Basic-auth credentials for the Twitter API (replace with your own).
    $browser->credentials('twitter.com:80', 'Twitter API', 'username', 'password');

    # Check the credentials, then post the status update.
    $browser->get("http://twitter.com/account/verify_credentials.json");
    my $response = $browser->post($url, {status => $output});
}
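
You can try the script from a shell first using the debug argument, which prints the status text instead of tweeting it (the starttime and chanid values below are just examples):

twitter.pl starttime=20100606183000 chanid=1021 debug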

Python Implementation

This is a script to tweet recordings currently taking place on the local backend. Other scripts I have seen also tweet when watching LiveTV; this script only tweets actual recordings.

I personally use it to tweet the recordings which in turn get pushed to my iPhone so I know a recording has started.

The variables are given in the script. Change them, chmod +x it and run.

PLEASE NOTE: Twitter limits the number of API calls that can be made in an hour (150, I think). If this limit is reached the script will write to the log file and pause for 30 minutes.

REQUIRES: python, python-twitter (Ubuntu: sudo apt-get install python python-twitter)


mythtv_twitter_monitor.py

#!/usr/bin/python

###########################################################################
#
# This script will monitor mythtv for recordings and will tweet them.
# It detects recordings only. When watching liveTV it is not tweeted.
# Run on the mythtv backend.
#
# Christopher Kemp - chris.kemp05@gmail.com
# A lot of help from StephenF via http://ubuntuforums.org
#
# Written 2010
#
###########################################################################

import httplib
import twitter
import time
import os
import logging
from MythTV import MythDB, MythBE

###########################################################################
############### Variables ################
###########################################################################

#Twitter Settings
twlogin = 'user@domain.com'
twpasswd = 'password'

#Mythtv SQL database login
dbuser = 'mythtv'
dbpass = 'password'

#Location of the log file (Must be writeable by the user running this script)
log_file = '/path/to/logfile'

###########################################################################
############# End Variables ##############
###########################################################################

#Enable the log
logging.basicConfig(filename=log_file,level=logging.DEBUG)

#Get the hostname
behostname = os.uname()[1]

class Encoder(object):
    """Class to analyse MythTV encoder activity."""

    def is_recording(self, mythdata):
        return self.match in mythdata

    def current_recording(self, mythdata):
        """Returns whatever the encoder is currently recording."""

        if self.is_recording(mythdata):
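            # Crude string slice: trims the Program object's printable form down to just the show details.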
            return str(self.back_end.getCurrentRecording(self.id))[10:-37]
        else:
            return None

    def new_recording(self, mythdata):
        """Returns None for all but new recordings."""

        rec = self.current_recording(mythdata)
        if rec == self.old_rec:
            return None
        else:
            self.old_rec = rec
            return rec

    def __init__(self, back_end, id):
        self.back_end = back_end
        self.id = id
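        # String that appears on the backend status page (port 6544) when this encoder is recording locally.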
        self.match = 'Encoder %d is local on %s and is recording' % (id,behostname)
        self.old_rec = None


def main():
    # Connect to Twitter and MythTV backend.
    twitter_h = twitter.Api(username=twlogin, password=twpasswd)
    myth_be = MythBE(db=MythDB(args=(('DBHostName','localhost'),
         ('DBName','mythconverg'), ('DBUserName',dbuser),
         ('DBPassword',dbpass))))

    # Make a list of Encoder objects for encoders 1 and 2.
    encoders = [Encoder(myth_be, x) for x in xrange(1, 3)]

    while 1:
        # Obtain MythTV status info.
        conn = httplib.HTTPConnection('localhost:6544')
        conn.request('GET', '/')
        mythdata = conn.getresponse().read()

        # Check each encoder in turn.
        for enc in encoders:
            show = enc.new_recording(mythdata)
            if show is not None:
                #See if twitter can be reached
                try:
                    #If twitter can be reached then tweet show
                    twitter_h.PostUpdate('Recording: %s' % show)
                    time.sleep(120)
                except:
                    # Twitter threw an error, usually from exceeding the hourly API call limit.
                    # Limits are reset after an hour, so sleep for half an hour before carrying on.
                    logging.warn('Could not post to Twitter; the API call limit has probably been reached. Sleeping for 30 mins, then carrying on as normal.')
                    time.sleep(1800)

        # Pause before polling the backend status page again so the loop does not hammer the backend.
        time.sleep(30)


if __name__ == '__main__':
    main()
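
The script loops forever, so you will normally want to start it in the background on the backend. One way of doing it (the path is just an example) is:

nohup /path/to/mythtv_twitter_monitor.py &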