Hi -
Re (1): Excel has options to connect to a website, though you would have to run the import manually each time.
For (2), a completely automated option: it's possible to web-scrape the Racing Post site, and you don't have to be a member to access the betting forecasts. However, I haven't seen any sign that they are stored historically (for members or otherwise), so you would only be able to collect them on a going-forward daily basis. The site doesn't change its layout often, so the risk of having to rewrite a program that does this is slight, though as with any web scraping you should check the Ts&Cs to confirm it's a legitimate use of the site. Generally, if you have access personally and are only scraping data for personal use (as opposed to distribution), you should be fine.
As for how to go about it: if you've never done this before there is a learning curve, but Perl, Python, Ruby and even R are all good choices for web scraping, since the basic steps are mostly the same:
1. Connect to site and loop/traverse to required pages, looking for required text as you go
2. Use an HTML-parsing module in one of those languages to give the page some structure (this is optional: if the parser proves too complex, you can usually work on the raw text and skip straight to step 3)
3. Use regular expressions to extract the required text.
4. Store it locally.
You can do this in pretty much any language, though, including Java and C#.
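To make steps 3 and 4 concrete, here is a minimal Python sketch using only the standard library. A sample HTML snippet stands in for the live page (in practice you'd fetch it with urllib.request or the requests package in step 1), and the class names and the forecast markup are hypothetical: inspect the real Racing Post pages to find the actual structure.

```python
import csv
import re

# Stand-in for a fetched page; the "runner"/"price" class names are
# made up for illustration, not taken from the real site.
SAMPLE_HTML = """
<div class="forecast">
  <span class="runner">Red Rum</span> <span class="price">5/2</span>
  <span class="runner">Desert Orchid</span> <span class="price">7/1</span>
</div>
"""

def extract_forecast(html):
    """Step 3: pull out (runner, price) pairs with a regular expression."""
    pattern = re.compile(
        r'<span class="runner">(.*?)</span>\s*'
        r'<span class="price">(.*?)</span>'
    )
    return pattern.findall(html)

def store_locally(rows, path):
    """Step 4: write the day's forecast to a local CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["runner", "forecast_price"])
        writer.writerows(rows)

rows = extract_forecast(SAMPLE_HTML)
store_locally(rows, "forecasts.csv")
```

Run daily (e.g. from cron or Task Scheduler) and appended to a dated file, this gives you the going-forward archive mentioned above. For anything beyond a simple page, a proper HTML parser (step 2) is more robust than regexes alone.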