Back in 2013, I shipped the Social Media Scripting Framework for the first time. I was excited about it but, at the same time, I realized that some things were clearly too complicated. There is still a lot of work to do to make it simpler and more capable; this is definitely not over. Anyway, I would like to spend some time showing you how the new updates simplify the way you interact with the framework and how to get the most out of it.
Working with Excel
The Excel module has changed a great deal. First, you no longer need a local copy of Excel installed in order to work with data stored in Excel files. Of course, you can still have it, but the Social Media Scripting Framework doesn't rely on it any more.
Let's compare what your experience used to be with how it looks today:
On previous versions …
# Opening an Excel file ...
$excel               = New-ExcelInstance
$book                = Open-ExcelFile -instance $excel -file .\SMCampaignControl-demo-201303-0.xlsx
$CampaignSheetId     = Get-ExcelSheetIdByName -book $book -name 'campaign'
$CampaignDataSource  = Get-ExcelSheet -book $book -id $CampaignSheetId
$CampaignInfoSchema  = Get-ExcelHeaders -sheet $CampaignDataSource
$campaign            = Load-ExcelDataSet -sheet $CampaignDataSource -schema $CampaignInfoSchema    # ETC: 15 minutes.
Close-ExcelFile -instance $excel -book $book

# Saving data to an Excel file ...
$excel               = New-ExcelInstance
$book                = Open-ExcelFile -instance $excel -file .\SMCampaignControl-demo-201303-0.xlsx
$CampaignSheetId     = Get-ExcelSheetIdByName -book $book -name 'campaign'
$CampaignDataSource  = Get-ExcelSheet -book $book -id $CampaignSheetId
$CampaignInfoSchema  = Get-ExcelHeaders -sheet $CampaignDataSource
Save-ExcelDataSet -DataSet $UpdatedCampaign -sheet $CampaignDataSource -schema $CampaignInfoSchema    # 6 mins.
Save-ExcelFile -book $book
Close-ExcelFile -instance $excel -book $book
And, with the new version …
# Opening an Excel file ...
$campaign = Import-ExcelDataSet -file .\SMCampaignControl-demo-2014-0.xlsx -sheet campaign -DataStart 3,2

# Saving data to an Excel file ...
Export-ExcelDataSet $UpdatedCampaign -file .\SMCampaignControl-demo-2014-0.xlsx -sheet campaign -DataStart 3,2
Despite being much simpler, no functionality has been lost. In fact, the new implementation adds the ability to define the exact point in the sheet where your dataset begins, via the -DataStart parameter you can see above.
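For instance, assuming that -DataStart takes the row and the column where the data begins (my reading of the examples above), a sheet whose data starts at cell B5 could be loaded like this:

# Hypothetical layout: the 'campaign' data begins at cell B5 (row 5, column 2).
$campaign = Import-ExcelDataSet -file .\SMCampaignControl-demo-2014-0.xlsx -sheet campaign -DataStart 5,2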
The Information and Process Cycle …
On previous versions of the framework, it was still necessary to transform and/or attach external data to the data retrieved from the different APIs. That operation happened during the so-called “Normalization process” which, by the way, couldn't be executed until the main information had been received from the service we were interrogating. This separation between “Data Acquisition” and “Data Normalization” also made regular information updates more complex than necessary.
There was also no way to enforce common data structures for each digital channel. As a result, datasets coming from different digital channels were not compatible with each other… Long story short: a little nightmare of multistage processes that had to be performed in very specific ways…
But let's make it visual. Let's see what the “before” and the “after” look like:
On previous versions …
$TwitterTL           = Get-TweetsFromUser cveira -results 250 -quick               # Data Acquisition
$NormalizedTwitterTL = Normalize-TwitterTimeLine $TwitterTL -IncludeAll            # Data Normalization

# (Re)Building a "Virtual Time Line" from a set of permalinks. This was the way of "updating" information ...
$UpdatedTwitterTL    = Rebuild-TwitterTimeLine -from $campaign                     # Data Acquisition: Timeline with "updated information"
$NormalizedTwitterTL = Normalize-TwitterTimeLine $UpdatedTwitterTL -IncludeAll     # Data Normalization

# Analyze-FBTimeLineProspects can only be used with Facebook data sets ...
$FBTimeLine | Analyze-FBTimeLineProspects | sort Likes, Comments -descending | select -first 10 | format-table -autoSize
And, with the new version …
# Data Acquisition + Data Normalization. All in one shot.
$TwPosts = Get-TwTimeLine -name cveira -results 100 -quick

# Data Update + Data Normalization. All in one shot.
$TwUpdatedPosts = Update-TwPosts -from $( $TwPosts | where { ( [datetime] $_.NormalizedPost.PublishingDate ).Year -eq 2014 } )

# This "Prospect Analysis" function now works for any supported platform.
$TwPosts.PostConnections | Measure-SMPostProspects
As you can see, we now just focus on “what we want”. We don't need to remember that there is a “Normalization process” or a “Rebuilding process”, nor figure out how they work. We just declare what we want using very basic primitives: “Get” and “Update”. That's it.
Notice that we can now define advanced in-line filters to narrow the information we want to update:
$TwPosts | where { ( [datetime] $_.NormalizedPost.PublishingDate ).Year -eq 2014 }
Because data structures are now standardized, you can reuse that filter as a pattern and apply it to other digital channels. In other words: learn things once, and profit as many times as you use them ;-).
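To make that concrete: assuming you have already retrieved Facebook and LinkedIn timelines with Get-FBTimeLine and Get-LINTimeLine (as in the next section), the very same filter works unchanged:

# The same date filter, reused verbatim against other channels, thanks to the shared NormalizedPost structure.
$FBPosts  | where { ( [datetime] $_.NormalizedPost.PublishingDate ).Year -eq 2014 }
$LINPosts | where { ( [datetime] $_.NormalizedPost.PublishingDate ).Year -eq 2014 }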
Social Media Campaigns and Social Media Analytics
In a previous post I already discussed how to run a Social Media campaign with PowerShell. It basically explored ways of building datasets with information coming from different digital sources so that we could, afterwards, merge and correlate them with other business-related data. The goal remains the same. So the question is: are there any significant changes? Let's have a look:
### Twitter ----------------------------------------------------------------------------------------

# It is always a good idea to get to know how much time it takes to acquire data from a Social Network
# or any Digital Channel. Measure-Expression is part of the framework too ;-)
Measure-Expression { $TwPosts = Get-TwTimeLine -name cveira -results 100 -quick }    # 100 posts - 00:01:44

$TwPosts | Export-DataSet -SourceType TwPosts -label MyCampaign                      # Optional step. But it might save you time later on.

# A couple of weeks later, let's see how many posts we have from February ...
$TwPosts = Import-DataSet .\DataSet-TwPosts-MyCampaign-2014220-1657-26-392.xml
$TwPosts | where { ( [datetime] $_.NormalizedPost.PublishingDate ).Month -eq 2 } | measure    # 65 posts

# Instead of updating our full Timeline, let's update just February, because that's what we want. That would be much quicker!
$TwUpdatedPosts = Update-TwPosts -from $( $TwPosts | where { ( [datetime] $_.NormalizedPost.PublishingDate ).Month -eq 2 } )
$TwUpdatedPosts | Export-DataSet -SourceType TwPosts -label MyCampaign                # Let's persist the data, just in case ...


### Facebook ---------------------------------------------------------------------------------------

# Let's run a quick Data Acquisition. It won't attach all the external information to the dataset, but it will be quicker.
$FBPosts = Get-FBTimeLine -quick                                                      # 250 posts - 00:42:44

# We only want full data from posts published this year.
$FBUpdatedPosts = Update-FBPosts -from $( $FBPosts | where { ( [datetime] $_.NormalizedPost.PublishingDate ).Year -eq 2014 } )
$FBUpdatedPosts | Export-DataSet -SourceType Facebook -label MyCampaign               # Let's persist the data, just in case ...


### LinkedIn ---------------------------------------------------------------------------------------

# Again. Let's run a quick data acquisition.
$LINPosts = Get-LINTimeLine -results 100 -quick                                       # 62 posts - 00:13:31 (non cached data)

# We only want full data from posts published this year.
$LINUpdatedPosts = Update-LINPosts -from $( $LINPosts | where { ( [datetime] $_.NormalizedPost.PublishingDate ).Year -eq 2014 } )
$LINUpdatedPosts | Export-DataSet -SourceType LinkedIn -label MyCampaign              # Let's persist the data, just in case ...


### Merging Digital & Business Data ----------------------------------------------------------------

# Let's aggregate data coming from all channels.
$FullDataSet = $TwUpdatedPosts + $FBUpdatedPosts + $LINUpdatedPosts

# Let's load the existing campaign data from Excel (if any).
$campaign = Import-ExcelDataSet -file .\MyCampaign-2014-0.xlsx -sheet campaign -DataStart 3,2

# Optional step: let's verify that our campaign hasn't been updated yet ...
$campaign | where Channel -eq "Facebook" | select -first 10 | format-table Conversations, Likes, Audience, Downloads, Clicks -AutoSize

Conversations Likes Audience Downloads Clicks
------------- ----- -------- --------- ------
            0     0        0         0      0
            0     0        0         0      0
            0     0        0         0      0
            0     0        0         0      0
            0     0        0         0      0
            0     0        0         0      0
            0     0        0         0      0
            0     0        0         0      0
            0     0        0         0      0
            0     0        0         0      0

# Let's update our campaign dataset ...
$UpdatedCampaign = Update-DataSet $campaign -with $FullDataSet.NormalizedPost -using $CampaignRules

# Optional step: now, let's make sure that our campaign has been updated ...
$UpdatedCampaign | where Channel -eq "Facebook" | select -first 10 | format-table Conversations, Likes, Audience, Downloads, Clicks -AutoSize

Conversations Likes Audience Downloads Clicks
------------- ----- -------- --------- ------
            5    40     2426         0      0
            7    42     1873         0      0
            2    45     1706         0      0
            1    34     1608         0      0
           12    44     1886         0      0
            6    75     3368         0      0
            2    58     1948         0      0
            0    37     1672         0      0
            6    55     2098         0      0
           12    34     2378         0      0

# Finally, let's save our updated dataset into a new Excel file ...
Export-ExcelDataSet $UpdatedCampaign -file .\MyCampaign-2014-1.xlsx -sheet campaign -DataStart 3,2
Notice how the process for each digital channel is exactly the same, and how expressive each operation is… Now you are ready to revisit my previous posts and compare for yourself.
Conclusion
I know that there is still a long way to go. But, fortunately, this new version of the Social Media Scripting Framework is not only simpler, but also more consistent, expressive and declarative. More “PowerShelly” in many ways ;-). In fact:
- With this new release it is pretty straightforward to write reusable “recipes” that, eventually, you could share with others (see the sketch after this list).
- It is now much easier to build and maintain datasets that map digital media information to business data.
- Now it is realistic to think of feeding your analytical models with meaningful on-line information in a sustainable way because the resources needed to do it are significantly lower.
- You can now even consider delegating certain parts of your Social Media Analytics process, or automating them completely, so that you can focus on higher value-added business activities.
- …
Sure, it is still a prototype with significant functional gaps. Nevertheless, I am confident that we have laid a solid foundation on which to build a powerful and robust tool capable of unleashing your digital data and boosting your digital assets. But hey, that's just what I think and, actually, it doesn't matter at all! What matters is what you think. Do you believe that we have succeeded? Do you think we are on the right track?
Picture: “Less is more” by othree. Licensed under CC BY 2.0.