VideoPhill » tech http://www.videophill.com/blog Video logging, other blogging and other ventures Thu, 29 Jun 2017 17:27:36 +0000 en-US hourly 1 http://wordpress.org/?v=3.4.2

Finally – an Osprey Alternative
http://www.videophill.com/blog/finally-an-osprey-alternative/ http://www.videophill.com/blog/finally-an-osprey-alternative/#comments Thu, 01 May 2014 15:14:37 +0000 mosmondor http://www.videophill.com/blog/?p=548

For years, video capture, at least for media monitoring companies, has depended on Osprey capture cards. They are the best in the field, and once you try one, you don't look anywhere else. You just pay the price and are satisfied with it. The card has excellent drivers with tons of options, SimulStream as a (paid) option… a real beauty.

However, as we said above, it is pricey. For an Osprey 460e, you need to hand out about $1,200 USD. That's $300 per channel.

Now, click here:

http://www.vd-shop.de/simultaneously-capture-d130fpsinput-interface-rcabnc-inter-p-591.html

YES! 6 channels for 320 Euro ($400 USD). I won't calculate the per-channel price here, since it is already obvious that Osprey is beaten, at least as far as price is concerned.

In fact, let's see how much you save with the new cards on a setup of, say, 24 channels:

Osprey: 6 cards, $7200
VCAE: 4 cards, $1600

So on capture hardware alone, you could save $5,600. Add to that the lower cost of server hardware, since you can pack everything into fewer PCs.
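The arithmetic above can be sketched in a few lines (prices as quoted in this post; "VCAE" is the 6-channel card linked below):

```python
# Back-of-the-envelope cost comparison for a 24-channel capture setup.
import math

channels_needed = 24

osprey_channels_per_card = 4
osprey_price_per_card = 1200  # USD, Osprey 460e

vcae_channels_per_card = 6
vcae_price_per_card = 400     # USD (320 EUR at the time of writing)

def setup_cost(channels, channels_per_card, price_per_card):
    cards = math.ceil(channels / channels_per_card)
    return cards, cards * price_per_card

osprey_cards, osprey_cost = setup_cost(channels_needed, osprey_channels_per_card, osprey_price_per_card)
vcae_cards, vcae_cost = setup_cost(channels_needed, vcae_channels_per_card, vcae_price_per_card)

print(f"Osprey: {osprey_cards} cards, ${osprey_cost}")  # Osprey: 6 cards, $7200
print(f"VCAE:   {vcae_cards} cards, ${vcae_cost}")      # VCAE:   4 cards, $1600
print(f"Saved:  ${osprey_cost - vcae_cost}")            # Saved:  $5600
```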

So, to follow up on the excitement of finding that this card exists, I immediately ordered a sample to try it with our capture software. It came in a few days, and we went on and installed it…

And the story can end here, since the card works as it should out of the box, making media monitoring installations even cheaper now.

Setting up an automated ad monitoring service for TV
http://www.videophill.com/blog/setting-up-an-automated-ad-monitoring-service-for-tv/ http://www.videophill.com/blog/setting-up-an-automated-ad-monitoring-service-for-tv/#comments Sat, 26 May 2012 08:57:48 +0000 mosmondor http://www.videophill.com/blog/?p=526

So you want to set up your own automated advertisement monitoring for some TV channels? And you probably have an idea of how to sell the reports from the whole system? Let me explain one possible way of doing it.

Overview

An advertisement monitoring system isn't that complicated, but it isn't simple either. You'll need computers, people, and some kind of service to automatically track advertisements that have been spotted once.

Recording

For starters, you have to be able to record all the TV channels you need. Depending on the TV system used in your country, you'll have several options. From our shop, we can solve recording for analog TV, DVB-T, DVB-S, and IPTV. In any case, if you can get a composite video signal from your set-top box, you will be able to record it with VideoPhill Recorder.

Storing and archiving

Recorded broadcast should go to some storage, sized according to the number of days you want your broadcast archive to be available. To calculate how much storage space you will need for it, you can use this on-line calculator.

Clipping and tagging

So now we have recordings of the TV broadcast. The next step is to form a team of people who will find and tag the first occurrence of each advertisement. The number of people and workstations required for the job depends on many factors:

  • number of channels monitored
  • channel ‘difficulty’ (how easy it is to find commercials on the channel)
  • number of shifts that people will do

In short, you’ll need some way of accessing the archive and clipping the portions of it in order to have clips of advertisements extracted and prepared for automated archive search.

One possible way of doing the job is by using the VideoPhill Player application. To see it in action, please see the video below…

Automated search

Almost there… Now you have your archived broadcast, and you have your clip library. To find all occurrences of all clips on all your channels, you simply pass the whole archive and clip library to the PlayKontrol Service and get your results. Results can be in any format that you require, such as text, Excel, PDF, XML, and so on.
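As an illustration, turning raw match results into one of those report formats is straightforward. A minimal sketch for CSV output follows; the field names are my own illustration, not PlayKontrol's actual schema:

```python
# Sketch: format raw match results as CSV (field names are illustrative).
import csv
import io

matches = [
    {"channel": "Antena Zagreb", "clip": "spot-A", "start": "2012-05-26 08:15:03", "duration_s": 30},
    {"channel": "Antena Zagreb", "clip": "spot-B", "start": "2012-05-26 08:45:41", "duration_s": 15},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["channel", "clip", "start", "duration_s"])
writer.writeheader()
writer.writerows(matches)
print(buf.getvalue())
```

The same record list could just as easily be fed to an XML or spreadsheet writer.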

Producing reports for your customers

The final component of the system (apart from selling the reports) is a team of people who will take the raw data that PlayKontrol provides and produce nice reports for your customers.

Creating a small (hopefully usable) utility: File Deleter
http://www.videophill.com/blog/creating-a-small-hopefully-usable-utility-file-deleter/ http://www.videophill.com/blog/creating-a-small-hopefully-usable-utility-file-deleter/#comments Fri, 11 May 2012 19:57:13 +0000 mosmondor http://www.videophill.com/blog/?p=501

When you are in media monitoring, you have TONS of files. For example, look at this:

Multitude of files, StreamSink archive

Another bunch of files, created by PlayKontrol

Every recorder, logger, and input produces a number of files on your system. Of course, each application such as VideoPhill Recorder or StreamSink has an option for deleting files after they expire (after one month, for example), but what if you have some other way of gathering information (media, metadata, anything) that won't go away by itself? I have several such data sources, so I opted to create a MultiPurposeHighlyVersatile FileDeleter application. I'll probably find a better name later; for now let's call it 'deleter' for short.

The Beginning

An application to delete files must be such a triviality that I could surely open Visual Studio and start coding immediately. Well, not really. In my head, this app will do every kind of deleting, so let's not get hasty; let's do it by the numbers.

First, a short paragraph of text that describes the vision – the problem we are trying to solve with the app – in a few simple words. That is the root of our development, and we'll revisit it several times during the course of development.

Vision:

‘Deleter’ should be able to free the hard drive of stale files (files older than some period) and keep the free hard drive space above some pre-determined minimum.

It's simple enough that I can remember it, and I'll be able to work down from it and create the next step.

The Next Step

For me, the next step (in this particular case) would be to try to see what ‘features’ the app has. The only way that works for me is to create a mock of the application UI and write down the things that aren't visible from the UI itself. Since this UI won't do anything but gather the parameters that define the behavior of the app, it will be a simple one, and it will fit nicely on one screen.

For the sketch I'll use Visual Studio, because I'm most comfortable with it. If it weren't my everyday tool, I'd probably use an application such as MockupScreens, which is a completely trivialized app-sketching gadget with powerful analyst features.

The process of defining the UI and writing down requirements took some time; I repeatedly added something to the UI, then to the list below, until I had a much clearer picture of what I'm actually trying to do.

Features:

  • it should have the ability to delete only certain files (defined by a ‘mask’ such as *.mp3)
  • it should be able to delete files by their age
  • it should be flexible in determining the AGE of a file:
    • various dates in file properties: created, modified, accessed
    • by parsing the file name
  • it should be able to delete from multiple directories
  • it should be able to either scan a directory as flat or dig into subdirectories
  • it should be able to delete files by criteria other than age:
    • files should be deleted if their total size exceeds some defined size
      • in that case, other files should be taken into account, again by mask
    • files should be deleted if the free drive space falls below some defined minimum
    • file size
  • when deleting by criteria other than file age, it should be possible to specify which files go first
  • it should support multiple parameter sets at one time
  • it should run periodically at predetermined intervals
  • it should be able to load and save profiles
    • profiles should have names
  • it should disappear to the tray when minimized
  • it should run at the lowest possible process priority

And here's the screen:

Mock of the Deleter UI used to define and refine the requirements

As you look at the UI mock, you’ll see some mnemonic tricks that I use to display various options, for example:

  • I filled in the textboxes to provide even more context to the developer (myself, wearing another hat, in this case)
  • I added vertical scrollbars even where the text in multi-line textboxes isn't overflowing, to suggest that there might be more entries
  • for multiple-choice options I deliberately didn't use a combobox (pull-down menu) – I used radio buttons, to again provide visual clues to the various options without the need to interact with the mock

From Here…

I'll let it rest for now, and tomorrow I'll try to see if I can further nail down the requirements for the app. From there, when I have a good feeling that this is something I'm comfortable with, I'll create an object interface that contains all the options from the screen above. While doing that, I'll probably update the requirements and the UI itself, maybe even revisit The Mighty Vision above.
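To make the requirements a bit more concrete, here is a minimal sketch of what the core deletion logic could look like, covering the mask, age, and name-parsing features from the list above. Everything here – the function names and the YYYY-MM-DD_HH-MM file-name stamp pattern – is my own illustration, not the eventual application:

```python
# Sketch of the 'deleter' core: remove files matching a mask whose age
# exceeds a limit. The age source is configurable: a filesystem date,
# or a timestamp parsed from the file name (assumed pattern below).
import os
import re
import time
from datetime import datetime
from pathlib import Path
from typing import Optional

NAME_STAMP = re.compile(r"(\d{4}-\d{2}-\d{2}_\d{2}-\d{2})")

def file_age_seconds(path: Path, source: str = "modified",
                     now: Optional[float] = None) -> float:
    """Age of a file according to the chosen source."""
    now = time.time() if now is None else now
    if source == "name":
        m = NAME_STAMP.search(path.name)
        if not m:
            raise ValueError(f"no timestamp in name: {path.name}")
        return now - datetime.strptime(m.group(1), "%Y-%m-%d_%H-%M").timestamp()
    st = path.stat()
    # note: 'created' maps to st_ctime, which is platform-dependent
    stamp = {"created": st.st_ctime, "modified": st.st_mtime,
             "accessed": st.st_atime}[source]
    return now - stamp

def delete_stale(root: Path, mask: str = "*.mp3", max_age_days: float = 30,
                 recursive: bool = True, source: str = "modified",
                 dry_run: bool = True):
    """Delete (or just list, when dry_run) files older than max_age_days."""
    found = root.rglob(mask) if recursive else root.glob(mask)
    victims = [p for p in found
               if file_age_seconds(p, source) > max_age_days * 86400]
    if not dry_run:
        for p in victims:
            p.unlink()
    return victims
```

The free-space and total-size criteria would plug in at the same spot where the age check sits, with a sort deciding which files go first.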

BTW, it took me about 2 hours to do both the article and the work.

Testing 3rd party stream capture application
http://www.videophill.com/blog/testing-3rd-party-stream-capture-application/ http://www.videophill.com/blog/testing-3rd-party-stream-capture-application/#comments Wed, 02 May 2012 16:13:26 +0000 mosmondor http://www.videophill.com/blog/?p=457

This is a response to a question from one of my prospects, which can be summarized as:

Why should I buy StreamSink at $10,000 when Replay A/V can do the same thing for $100 (if I buy 2 licences for 2 computers)?

I could make several objections to the idea of using a consumer product for business purposes, but instead I'll try to focus on functionality (at least for this post).

Purchasing and installing

I quickly purchased Replay A/V for $50 and went on to install it. Upon installation, it offered to install WinPcap (to provide stream discovery) and some other utility for converting the saved material. I declined.

Entering stations

Once installed, I tried to copy my stream list into it and have it record continuously.

After some investigation, I found out that there is no way to insert the list of the stations at once, so I’m going to enter them one by one.

OK, I entered Antena Zagreb with its stream URL, and went on to finding the start button for it. I found it in the context menu for the item in the list (right click, start recording, …).

I remembered going through the options for a channel and finding that you have to explicitly enable the option for splitting the file into segments, so I went on and did that.

I'll leave it running now and move on to entering the rest of the stations.

I was about 1/4 of the way down the list when I got to a WMA stream, and was really curious whether it would be accepted, since there is no option anywhere to pick a stream type. It was, and for now it seems to be captured normally.

While I am entering data into the software, whenever it does its file splitting at the 5-minute intervals, the whole GUI freezes and becomes unresponsive for 2–3 seconds. What I am interested in is whether there will be a gap in the recording of the station that is being cut. BTW, the computer I am doing the analysis on isn't that weak…

Also, it seems that I entered a stream that doesn't exist. The application is persistent in trying to connect to it, but while doing so, it freezes again for a few seconds. However, it's nothing to be alarmed about.

I also found out that in order for the app to be persistent about reconnecting, it has to be additionally configured, as that is not the default mode of operation.

OK, so I finally entered all the stations. It gets rather annoying after a few minutes, because at the 5-minute chunk interval the app gets its freezing moments rather frequently, and despite the fact that it isn't a problem AFTER everything is entered, it really is annoying. Here is the filled-up application:

Testing the recorded stuff

To do that, I will first share the folder with recordings so I would be able to see it from another (this) machine.

As expected, every channel is saved in its appropriate folder:

Now, let’s examine the contents of some folders that are recorded here…

First folder I have is Antena Zagreb, and here it is:

I won't comment on the file naming now, but I will tell you what happened when I double-clicked the .m3u file that should contain the list of recorded mp3 files. Winamp loaded it and CRASHED my machine completely. I don't say it will crash yours, but my Winamp, when faced with certain media files it can't recognize, goes berserk. The problem here lies in the fact that Antena Zagreb has an AAC+ stream, and it was interpreted erroneously, creating 'mp3' files that crashed Winamp. Here is one file for you to try – use it at your own risk.

Antena Zagreb Jan 02_05

Media Player crashed as well, but I could END it; with Winamp I had to restart the whole machine.
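The AAC-as-mp3 mixup could have been caught by sniffing the first frame header instead of trusting the file extension. A minimal sketch, covering only the two formats involved here (ADTS AAC and MPEG Layer III); this is my illustration, not anything Replay A/V actually does:

```python
# Sketch: distinguish ADTS AAC from MP3 by the first frame header.
# Both start with an 11-bit 0xFF sync; the two 'layer' bits differ:
# 00 for ADTS AAC, 01 for MPEG audio Layer III (mp3).
# (Real mp3 files often start with an ID3 tag instead, reported
# here as "unknown" for simplicity.)
def sniff_audio(first_bytes: bytes) -> str:
    if len(first_bytes) < 2 or first_bytes[0] != 0xFF or (first_bytes[1] & 0xE0) != 0xE0:
        return "unknown"
    layer = (first_bytes[1] >> 1) & 0x03
    if layer == 0x00:
        return "aac"        # ADTS: layer bits 00
    if layer == 0x01:
        return "mp3"        # MPEG Layer III: layer bits 01
    return "other-mpeg"

print(sniff_audio(b"\xff\xf1\x50\x80"))  # aac
print(sniff_audio(b"\xff\xfb\x90\x00"))  # mp3
```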

The last test I want to do in this post is to see whether subsequent files are saved with no gap between them. For that, I have to find an mp3 file that won't actually break my player.

Found one, and had no luck. Even with pure mp3 files, Winamp gives up and puts its legs in the air. I tested the same with Media Player, and it seems that recordings overlap by a few seconds, so that checks out.

Before the conclusion, let's just take a look at the resource usage of the application:

Conclusion

You might be able to use Replay A/V for your media monitoring purposes and save a great deal of money. However, please note that:

  • I didn't find any option for error reporting (which would let you see that a stream has been off-line for an extended time)
  • if all the channels cut their files at the same time, the app would be unresponsive for at least 2 × number_of_channels seconds
  • the CPU usage profile is minimal; however, I found that memory usage rises LINEARLY over time, which would lead to imminent application death after some time (you do the math)
  • I didn't use the scheduler to create persistent connections; if I had, and I were on a bad connection with lots of breaks, the app would be nearly impossible to use due to the freezes upon connection
  • there is no option (or I wasn't able to find it) for renaming the files so they would use time-stamped names
  • it doesn't provide support for VideoPhill Player, which is an archive exploration tool created just for media monitors

Additional info…

After several hours (around 6), this is the memory usage, taken using Process Explorer (procexp).

For those who can't read a memory usage graph: this means the application has a memory leak, and at this rate it would exhaust its memory in less than 24 hours, since it is an x86 process.
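Doing the math on that claim is simple. The numbers below are assumptions read off a graph like the one above (the post itself only shows the picture), combined with the default ~2 GB user address space of a 32-bit process:

```python
# Rough sanity check of the "dead in under 24 hours" claim.
leak_mb_per_hour = 600 / 6          # assumed: ~600 MB gained over 6 hours
address_space_mb = 2048             # default usable space for an x86 process
hours_to_exhaust = address_space_mb / leak_mb_per_hour
print(round(hours_to_exhaust, 1))   # ≈ 20.5 hours at this assumed rate
```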

Treatment of repeating content
http://www.videophill.com/blog/treatment-of-repeating-content/ http://www.videophill.com/blog/treatment-of-repeating-content/#comments Sat, 28 May 2011 20:26:47 +0000 mosmondor http://www.videophill.com/blog/?p=242

In media monitoring systems and environments, we often have to identify and COUNT the occurrences of some playback event. The most common example is when you have to monitor all occurrences of the same commercial audio spot.

Multiple parties are interested in tracking audio spots:

  • broadcasters
  • advertisers (clients)
  • agencies (client representatives)
  • government regulators

Let me briefly cover what they need to know about playback of commercial audio spots.

Broadcasters

They need proof that they played something at a certain time, to show it to the client and be able to issue invoices for services provided.

Advertisers and agencies

They both need proof that something – their own commercials – was played, at certain times and the correct number of times. However, they might also want to look into other brands so they can track their competitors.

Government regulators

They usually want to know whether legal or regulatory requirements on broadcast media are met. Such requirements are, for example, to have no more than 2 minutes of advertisements per hour, or to have commercial blocks clearly separated from the rest of the program by special markers called ‘jingles’ or ‘breaks’.

Let’s get back to..

The problem

The usual workflow for the above is to feed the matching technology the samples you want to track, and the technology gives you the locations in the timeline where the samples occur. That is one thing PlayKontrol can do for you. But what if you don't have the samples, and still want to discover them?

The rescue

The traditional way would be to go through the known parts of the program, mark them, clip them out, and have the audio spotter search for all occurrences. With that method, and with a lot of clipping, you'll get some accuracy, but some clips will escape your attention because they aren't in their usual place – for example, a commercial outside its commercial block.

The other way is to do it with PlayKontrol SelfMatching technology. It works like this: a whole day of archive is given to PlayKontrol, and the result of the process is a list that contains ALL of the matches for the given day.

So every repeating audio clip, no matter how small, will be listed here. From there, your analyst's only tasks would be to:

  • browse through the clips,
  • listen to them,
  • maybe fine-clip them,
  • tag them and
  • put them into the repository.
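The self-matching idea itself can be shown with a toy sketch. This is not PlayKontrol's actual algorithm – the fingerprints below are stand-in strings, where a real system would compute acoustic fingerprints over the audio:

```python
# Toy sketch of self-matching: group windows of the day's audio whose
# fingerprints collide; every group with 2+ members is a repeat candidate.
from collections import defaultdict

def self_match(fingerprints):
    """fingerprints: list of (time_offset_seconds, fingerprint)."""
    seen = defaultdict(list)
    for offset, fp in fingerprints:
        seen[fp].append(offset)
    # keep only material heard more than once during the day
    return {fp: offsets for fp, offsets in seen.items() if len(offsets) > 1}

day = [(0, "news-jingle"), (120, "spot-A"), (3600, "news-jingle"),
       (3720, "spot-B"), (7200, "news-jingle"), (7320, "spot-A")]
print(self_match(day))
# {'news-jingle': [0, 3600, 7200], 'spot-A': [120, 7320]}
```

Note that spot-B, aired only once, correctly drops out of the result.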

I have created a picture containing the results of the process in a ‘visual representation’. Here it is:

Please note the following:

  • both X and Y axes represent time
  • the grid divides time into one-hour intervals
  • grayed areas are the time intervals 00–06 and 18–24 (say, night time)
  • the size of a point represents the length of the matched clip

Try to figure out the rest for yourself.
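For the curious, here is one way such a picture could be derived from self-matching output: every pair of occurrences of the same clip becomes one point, with both coordinates being times of day. This is a guess at the plot's construction, on toy data:

```python
# Sketch: turn self-matching output into scatter-plot points
# (x = one occurrence, y = the other, both in hours; third value is
# the clip length, which would drive the point size).
def match_points(matches):
    """matches: list of (offsets_in_seconds, clip_len_seconds)."""
    points = []
    for offsets, length in matches:
        for i, a in enumerate(offsets):
            for b in offsets[i + 1:]:
                points.append((a / 3600, b / 3600, length))
    return points

pts = match_points([([0, 3600, 7200], 30)])
print(pts)  # [(0.0, 1.0, 30), (0.0, 2.0, 30), (1.0, 2.0, 30)]
```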

Capturing and archiving of DVB-T signal
http://www.videophill.com/blog/capturing-and-archiving-of-dvb-t-signal/ http://www.videophill.com/blog/capturing-and-archiving-of-dvb-t-signal/#comments Sat, 28 May 2011 15:13:03 +0000 mosmondor http://www.videophill.com/blog/?p=236

Whether it's compliance recording, where you capture and save your own broadcast, or media monitoring, where you'd like to capture multiple signals off the air, you have some interesting choices here.

Let's explore your options on the subject in detail, whether it's one-channel or multiple-channel recording.

One channel DVB-T recorder

Recording one channel is simple no matter how you choose to record it. Let me present the two main options here, so you can see what is most applicable in your situation.

The simplest way of recording would be to have one DVB-T set-top box, and use it to send a composite signal into the computer via an Osprey 210 card. It is the most robust solution, but it has some (serious) drawbacks:

  • cheap DVB-T tuners can ‘lock up’ and freeze the picture
  • low-quality tuners can also de-sync audio and video over time – and you need 24/7 operation here
  • you'll need an extra power connector for the set-top box
  • STBs produce extra heat

An alternative way of recording is to use a DVB-T card such as the Asus MyCinema-ES3-110, use software such as TubeSink to tune to a frequency and extract the required channel from it (this is called DEMUXing), and forward the extracted channel to VideoPhill Recorder for further processing (recording, streaming, …).

BTW, TubeSink mentioned above can be used even without VideoPhill Recorder, as it DEMUXes the channels and can forward them to any computer on your network as a UDP Transport Stream playable with VLC. If you want to use it for non-commercial purposes, download it from here.

So in the case of 1-channel DVB-T recording, I would say it remains a toss-up whether to use an external set-top box with an Osprey capture card, or go with a pure software solution and some simple off-the-shelf DVB-T tuner.

But in case of…

Multiple channels DVB-T recording facility

The same options are available for multiple-channel recording facilities – but here is the catch. As you probably know, multiple DVB-T channels are packed and transmitted on one frequency; this is called multiplexing. The carrier for the transmitted channels is called a MULTIPLEX (MUX for short). It often carries 4 channels, and sometimes as many as 16 or more.

The currently recommended recorder density (channels per machine) is 4. One machine packed with an Osprey 460e will do 4 channels just fine.

So, let's say we need 16 channels, scattered across 3 MUXes (we have such a situation here in Zagreb). Using the conventional method (I would say that having 16 STBs is conventional, as bizarre as it seems) you'll need:

  • 4 recording servers
  • 4 Osprey 460e cards
  • 16 DVB-T set top boxes
  • PLENTY of mains outlets
  • some kind of signal distribution to feed all 16 STBs

Since you see where I'm going with this, let me suggest the following: let's use TubeSink to control 3 tuners in TWO MACHINES, and save on 4 Ospreys, 2 PCs, and the rest of the unnecessary equipment.

We'll put 2 tuners into one machine, and one tuner in the second machine. If the channel-per-MUX distribution is such that each machine gets its 8 channels, fine. If not, we'll instruct TubeSink to forward the Transport Stream to the OTHER machine, and that machine will perform the recording. That way, the load will be completely balanced between the two machines, and you'll have your 16-channel recorder in a nice and compact fashion.
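The equipment tally for this 16-channel example can be written down explicitly (counts taken from this post; the tuner-card count is one per MUX):

```python
# Equipment comparison for 16 channels across 3 MUXes:
# conventional STB + Osprey route vs. the TubeSink route.
conventional = {
    "recording servers": 4,    # 4 channels per machine
    "Osprey 460e cards": 4,
    "DVB-T set-top boxes": 16,
}
tubesink = {
    "recording servers": 2,    # 8 channels per machine
    "DVB-T tuner cards": 3,    # one tuner per MUX
}
saved_pcs = conventional["recording servers"] - tubesink["recording servers"]
print(saved_pcs)  # 2 PCs saved, plus all 4 Ospreys and all 16 STBs
```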

Even more compact?

Yes, it can go even further. There are dual DVB-T tuners such as the WinTV-HVR-2200 that can tune to two frequencies at once; with them, you could record as many channels as there are in two MUXes on one machine. Today, even desktop processors such as the i7 can encode 8 channels of video in real time. So, with a proper CPU (or multiple CPUs in server computers), even 16 channels could be encoded in one compact 2U rack-mounted unit.

However… (serious problem)

Using PC-based DVB-T cards will only work with free-to-air channels.

SD-SDI compliance recording
http://www.videophill.com/blog/sd-sdi-compliance-recording/ http://www.videophill.com/blog/sd-sdi-compliance-recording/#comments Sat, 28 May 2011 14:04:04 +0000 mosmondor http://www.videophill.com/blog/?p=226

I will try to write something about creating an archive for compliance recording purposes from an SD-SDI source. This post is a response to repeated inquiries about a system that would create such an archive.

Compliance recording in general

Just to remind us – compliance recording serves only one purpose: to prove or disprove that something went on the air at some time. That is the first and last thing this technology is used for. There is no requirement on the content of the signal – it just has to be good enough for someone to see the basics of what's in there.

This business requirement is countered from the other side by the need to make the archiving system as affordable as possible. And the cost of the system, if we leave out the software involved (system and application part), depends largely on:

  • picture quality
  • number of channels recorded
  • number of possible outputs needed
  • and days of the archive that should be kept

Picture (i.e. video) quality is directly proportional to the BITRATE of the recorded video. More bitrate means more apparent quality: in picture resolution, object motion, and so on. For example, most recorded videos that you might have on your computer are between 700 and 1000 kbit/s and were encoded with DivX or a similar encoder. Watching a movie at a 1000 kbit/s bitrate is highly enjoyable, and anything above that is required ONLY for HD or HD-Ready content.

Compliance recording means recording what’s on the air

Many people forget that when you do compliance recording, you have to record what actually comes out of your transmitter array, not the feed you are sending to it. In case your link goes down, or your output HF amplifier burns out, your endpoint picture will be NOTHING, and if you are recording only your output feed, there will be a great discrepancy in your logs.

So my suggestion: drop the idea of recording SD-SDI, and try to record what comes back from the air, in the form of an analog or DVB-T signal. That is the REAL broadcast that your viewers see.

The conflict of interest

We all want stuff to be as cheap as possible. In order to build an affordable recording system, we have to balance various things, but here I'll try to explore the differences when SD-SDI signal capture is required.

I will postulate these input parameters for the archive that will serve as the example here:

  • video bitrate: 512 kbit
  • audio bitrate: 64 kbit
  • archive duration (number of days to keep from today): 92 (3 months)
  • number of channels that needs recording: 4
  • required outputs: both WMV (Windows Media Video) and h.264 (at the same time)
  • recording resolution: 3/4 of PAL D1 picture size so: 540×432

To calculate the hard drive space requirements for this, I'll use the online bitrate calculator here on this blog.

The table below is taken from the calculator itself.

Video Bitrate | Audio Bitrate | Total Bitrate | Days | Channels | Disk Space
512 kbit/s    | 64 kbit/s     | 2304 kbit/s   | 92   | 4        | 2183.20 GB
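The calculator's number is easy to reproduce; it evidently treats 1 kbit as 1024 bits and 1 GB as 1024³ bytes:

```python
# Reproduce the calculator's result: 4 channels at 512+64 kbit/s,
# kept for 92 days.
video_kbit, audio_kbit = 512, 64
channels, days = 4, 92

total_kbit = (video_kbit + audio_kbit) * channels           # 2304 kbit/s
bytes_total = total_kbit * 1024 / 8 * days * 86400
gigabytes = bytes_total / 1024**3
print(round(gigabytes, 2))  # 2183.2
```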

So we need about 2200 GB of net drive space. To provide that kind of space with some fault tolerance, I would recommend two drives such as the WD Caviar Green 2500 GB, connected in a MIRROR VOLUME (RAID1). There is no need for an additional system drive, and almost all new motherboards provide an on-board implementation of RAID1 mirroring.

I'm kind of beating around the bush here – the main point of this article should be to help you choose whether to use the original SD-SDI signal or to convert it to a composite signal (or use the signal that comes back from the air).

Osprey vs DeckLink

The choice before us is either an Osprey 460e card or a DeckLink Quad card. The list price for the Osprey is about $1,200, and two DeckLink Quad cards are about $1,000. So the first impression is that you'll go cheaper with DeckLink. But…

So far, the Osprey has proven itself to be the absolute master of video capture. No system deployed so far has had any problems with the card whatsoever. You plug it in, install the drivers, and it works. And with the Osprey 460e, you'll use only ONE PCI-e slot. The other slot, if there is any, will be used by the graphics card.

The same thing about the slots applies to DeckLink as well. However, there is a "now shipping" image below the DeckLink Quad card name, and it implies something – it is red-hot new.

Recording multiple FM radio stations (works for AM, too)
http://www.videophill.com/blog/recording-multiple-fm-radio-stations-works-for-am-too/ http://www.videophill.com/blog/recording-multiple-fm-radio-stations-works-for-am-too/#comments Thu, 26 May 2011 23:10:19 +0000 mosmondor http://www.videophill.com/blog/?p=404

As it seems, we in media monitoring want to record everything. A good part of everything is still in the FM radio spectrum (or AM in some flat-land countries). And usually there are plenty of stations on the air that we have to record, at least a dozen at a given location…

Ancient history

Many, many years ago, when I was working at FirePlay (a great radio automation company and software), we had a task to produce a recorder that would record ONE channel of radio program 24/7. At that time, encoding MP3 in real time was some kind of science, and wasn't available on anything but the most advanced systems (I won't try to be exact here, but it was something along the lines of a Pentium 133 MHz).

So we built FireSave, the first version, which was able to handle 1 channel and record it to the hard drive, encoded in mp3 format. We even tried to use some obscure GSM codecs to save even more space…

Ancient history, but without dinosaurs

The setup above required a live external tuner to be connected to the Sound Blaster (yeah, really). We had some multi-channel cards, but they were expensive, and using them for confidence and/or compliance recording would be a waste of money.

Our need expanded from one channel to several, say 4. Since we had some expertise in running multiple channels, we quickly added more external tuners, replaced the Sound Blaster with some multi-channel monster (it was a Wave4, then a Gina24, then other stuff from EchoAudio, such as the Layla 3G), and finally upgraded the software so it could handle multiple channels.

It worked, with 4 external tuners attached to one PC, sometimes more; it looked like an octopus.

Present days (year 2009)

OK, but what if you need and want to record the 150 radio stations that a typical country like Croatia has? You'll be able to get audio cards with up to 16 audio inputs (even mono sound will be OK), but having that many external tuners is, or could be, a problem. And still, they can't all be heard in one place, so you'll need multiple recording sites to capture everything you need.

Or not?

The simple fact is that every good radio station will have an internet stream, so it can be heard on the internet. And there is a way to capture that stream off the internet and save it to the hard drive as if you recorded it. There are multiple tools on the internet that allow you to capture internet audio streams; just choose one of them, and you'll be able to record any radio station that has a stream. Before we created StreamSink, I was extensively using StationRipper for my own purposes, and that was the inspiration needed to create a very similar tool. It is similar in that it records internet audio (and video) streams, but one thing is very different: all ‘rippers’, including StationRipper, are designed to try to cut the audio stream at song boundaries, creating a library of songs for the user. Our task, on the other hand, was to create a system that records internet streams in multiple formats into the archive format usable by VideoPhill Player.

It isn’t anything special – just a bunch of files named in a certain fashion and cut every five minutes, with special care not to lose a single byte of the stream while cutting it.
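The cutting logic can be sketched roughly like this. This is a minimal Python illustration, not StreamSink's actual code: the five-minute segment length matches the text, but the station/date/time filename pattern is my assumption, not the real naming scheme.

```python
from datetime import datetime, timezone

SEGMENT_SECONDS = 300  # five-minute archive segments

def segment_start(ts: datetime) -> datetime:
    """Floor a timestamp to the start of its five-minute segment."""
    epoch = int(ts.timestamp())
    return datetime.fromtimestamp(epoch - epoch % SEGMENT_SECONDS, tz=timezone.utc)

def segment_filename(station: str, ts: datetime, ext: str = "mp3") -> str:
    """Build an archive filename for the segment containing `ts`
    (hypothetical naming pattern, for illustration only)."""
    start = segment_start(ts)
    return f"{station}/{start:%Y-%m-%d/%H-%M-%S}.{ext}"
```

Every byte received between two boundaries goes into the current file; at the boundary the writer closes the file and immediately opens the next one, so nothing is dropped in between.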

So with that system, recording 100 radio stations on a single computer is as simple as having a good internet connection. Of course, every stream will be recorded only as reliably as the server and the internet permit, and there is nothing you can do about that. When using this method, you must accept that you will sometimes lose parts of the archive to unforeseen events. Then again – the better radio stations (the ones for which you will need 100% of the archive) have better sources and better distribution servers, so your archive will be better covered.

Expected Archive Coverage

But what to do when there are NO streams?

Recently (summer 2011), a client needed to record multiple radio stations as well. However, after an initial investigation we concluded that the radio stations in question were either poorly represented on the internet or not present at all. So instead of capturing streams, we aimed to capture the FM radio signal directly. All we had was an antenna dipped in the airwaves that carried our radio stations (8 of them).

The strategy was as follows: I have a tool that can capture streams in the format that my application (the Player) needs, but we don't have the streams. So let's create them.

Shoutcast internet radio has been on the market for more than a decade. It is free, has tremendous support, and its software for creating and distributing internet radio streams is as robust as it gets, since it has been field-tested in possibly millions of usage scenarios.

Since I knew how to encode a stream and how to distribute it (locally) to StreamSink, I just needed to capture the FM signal somehow. Using 8 external tuners would look comical to the client, and I'd probably lose them, so I did a little digging and found a beauty in the form of a PCI card:

Professional PCI tuner adapter

This little monster (the AudioScience ASI8921) can capture 8 FM radio channels and hand them to the rest of the system through DirectShow or the waveIn API – just what we needed.

]]>
http://www.videophill.com/blog/recording-multiple-fm-radio-stations-works-for-am-too/feed/ 16
Refactoring, it’s fun – part 2 http://www.videophill.com/blog/refactoring-it%e2%80%99s-fun-%e2%80%93-part-2/ http://www.videophill.com/blog/refactoring-it%e2%80%99s-fun-%e2%80%93-part-2/#comments Sun, 22 May 2011 22:10:06 +0000 mosmondor http://www.videophill.com/blog/?p=218 In the first part I tried to set the scene and give you some background on my problem. I’ll try to continue now, and tell an interesting and, at the same time, informative story about performing the refactoring in question.

Most curious of all is that you don't have to prepare for refactoring. It can happen no matter what, and the price usually isn't so high after all. Let me remind you – the main tool of component design, dependency injection, wasn't used here. Yes, the classes do have their responsibilities cleverly defined, and that helps a lot, because if it weren't so, the whole deal would have to start a few steps ‘before’.

I’m not an experienced writer, so I don’t know if I will get the point across, but to me this refactoring was like building level upon level of scaffolding, using each level to test the next, and at the same time creating scaffolding that would later be used in the production environment! I guess there is a name for it – there has to be :)

Step 1:

Creating a duplicate of the main working class, and see how to force the rest of the application to use it when needed.

Say the class name is PCM_Hash, and it has the following structure (method declarations only):

class PCM_Hash
{
    public bool Initialize(PCMBufferedReader sourceDataReader);
    public uint GetNextHash();
}

My goal was to create an alternative to this class, while keeping the old class around as a reference to compare results against.

So I created the class PCMHash_R2. That was my first decision – to create the same class as before and try to get the same results from it, replacing its guts one step at a time.

Swapping in the replacement wasn't easy, so I extracted an interface, derived both classes from it, and wrote something like:

IPCM_Hash rph;
if (!_refactoring)
    rph = new PCMHash();
else
    rph = new PCMHash_R2();

At this point I would like to restate that I am in production all the time and have to decide on my feet, weighing all the implications that would arise. I mention this because anyone can see that some dependency injection could have been used here. But apart from having to spend a lot of time threading it through the code, I'm not sure how, or even if, it would work anyway.

Why: at testing time, I want both classes to function side by side.
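The side-by-side check then reduces to feeding both implementations the same input and comparing the hash streams they emit. A minimal sketch, in Python rather than the original C#, with `first_mismatch` as a hypothetical helper name:

```python
def first_mismatch(old_hashes, new_hashes):
    """Compare the hash sequences produced by the old and the refactored
    implementation; return the index of the first difference, or None
    if the two sequences agree completely."""
    for i, (a, b) in enumerate(zip(old_hashes, new_hashes)):
        if a != b:
            return i
    if len(old_hashes) != len(new_hashes):
        # one implementation produced extra hashes at the tail
        return min(len(old_hashes), len(new_hashes))
    return None
```

A mismatch index immediately tells you which window of samples the two implementations diverged on, which is exactly what you want when replacing the guts one step at a time.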

]]>
http://www.videophill.com/blog/refactoring-it%e2%80%99s-fun-%e2%80%93-part-2/feed/ 0
Refactoring, it’s fun – part 1 http://www.videophill.com/blog/refactoring-its-fun/ http://www.videophill.com/blog/refactoring-its-fun/#comments Sat, 21 May 2011 23:04:55 +0000 mosmondor http://www.videophill.com/blog/?p=212 It’s a story of refactoring code that isn’t prepared for it a single bit. By prepared I mean having test cases, dependency injection code, and so on. However, I have none of the above in the original code – just code that works.

Let me explain what I have, what it does, and where it should end up. The purpose of this refactoring session isn’t to get better performance while keeping the same functionality – it is to make the same algorithm, already proven in various tests, work on different data.

Before (a.k.a. now):

The component creates a PK_HASH from a sound file. By PK_HASH I mean “the code name for our latest tech that can crunch a whole audio file down to a few bytes, and later compare those bytes to bytes crunched from another file to tell you whether it’s the same sound file”. PK stands for PlayKontrol – the brand name.

So, there are a few steps to produce the PK_HASH from a sound file:

  • decode and read the file – the input is a file on disk (for example .mp3, .wma, .aac) and the output is PCM samples
  • from any kind of PCM samples (stereo, mono, 8-bit, 16-bit) we produce an array of shorts that must be rewindable
  • hash the data and produce the PK_HASH files

Decode and read the file

The input is a file on disk in any streamable format (we produce the files with StreamSink – see an archive example here: http://access.streamsink.com/archive/). That means .mp3, .aac, .wma, .ogg, and whatnot.

Currently this is done with a simple component that uses DirectShow to build a graph for the audio file, renders the graph and attaches a SampleGrabber filter to fetch the samples. The component goes from a file on disk to the whole PCM sample data in memory. That's feasible for a 5-minute file (5 min × 60 s × 4 bytes × 44,100 Hz ≈ 50 MB), and it can work even for 1-hour files. However, you can *feel* that this approach IS wrong, especially once you're told that the rest of the algorithm doesn't need access to the whole PCM data at once.

Rewindable sample array

PCM samples are promoted to 16-bit (if needed), and the channels are downmixed to mono. Again, this is done in memory, so the operation leaves 2 bytes in memory for every PCM sample.
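The promotion and downmix step is conceptually simple arithmetic. Here is a minimal Python sketch (the original component is C#/DirectShow – this only illustrates the math, and assumes 8-bit PCM is unsigned as in WAV files):

```python
def downmix_to_mono16(samples, channels=2, bits=16):
    """Promote PCM samples to 16-bit and average the channels down to mono.

    `samples` is a flat, interleaved list of per-channel integer samples:
    unsigned 0..255 for 8-bit PCM, signed -32768..32767 for 16-bit."""
    if bits == 8:
        # 8-bit PCM is unsigned; centre it around zero and scale to 16-bit
        samples = [(s - 128) << 8 for s in samples]
    mono = []
    for i in range(0, len(samples), channels):
        frame = samples[i:i + channels]
        mono.append(sum(frame) // len(frame))  # average the channels
    return mono
```

Averaging the channels (rather than summing) keeps the result inside the 16-bit range without clipping.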

Hashing and creating the file

The hashing process needs a moving, overlapping window over the sample array, and since we have everything in memory, that is a piece of cake. We take the data, process it, and write it into another byte array. Since the result is extremely dense, I won't cry about memory at this point, but yes – it is written into memory first and then saved to a file on disk.
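The moving, overlapping window can be expressed as a simple generator – a Python sketch, where the window and hop sizes in the test are made up for illustration (the actual PK_HASH parameters are not stated in the post):

```python
def overlapping_windows(samples, window, hop):
    """Yield fixed-size windows over the sample array; a hop smaller
    than the window makes consecutive windows overlap."""
    for start in range(0, len(samples) - window + 1, hop):
        yield samples[start:start + window]
```

With everything in memory this is trivial; the whole point of the refactoring below is to get the same window reads from a buffered stream instead.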

So here I have tried to explain how it works so far: it goes from the encoded audio file to PCM sample data in memory, downmixes that data to one PCM channel, processes the mono PCM samples to obtain the PK_HASH, and then writes it to a file.

So what do we actually need?

If you take a peek at the archive you'll find that every folder has audio files, and also a .hash file for every audio file present in the directory. Please note that not every directory is processed – only 20 of them, because processing is CPU-intensive and I have only a few PCs lying around to scrub the data.

This will improve in the future. For crunching the archive, even a POC (proof of concept) is OK, as it serves its purpose: it will go through the archive and leave processed PK_HASHes behind.

A process that runs in parallel waits for the PK_HASH file to be created, reads it, and matches it against the database. However, the next step to take is REALTIME processing.

To be able to process in REALTIME, the architecture goes something like this:

  • StreamSink is attached to a network stream of any kind and provides PCM sample output
  • the PCM sample output is downsampled and buffered
  • the hashing process consumes the buffered mono PCM samples and outputs results into a stream
  • the PK_HASH stream is again buffered and the results are processed by the MATCHER process

StreamSink PCM decoding

StreamSink is the application that captures internet media streams. Thanks to a feature request from DigitalSyphon, it can also process any media stream and provide PCM samples for it in real time, in the form of a Stream-derived class. So that part of the process is covered completely.

Buffering PCM samples

Now a new component had to be created – something that can buffer PCM samples from the Stream and provide floating, overlapping window reads for the hashing process. With some thinking I combined the inner workings of a circular buffer with something that can be used almost directly in the hasher process – by replacing the class implementation only.
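The idea can be sketched as follows. This is a simplified Python model, not the real C# component: a production version would bound its memory with a true ring buffer, which is only hinted at in a comment here.

```python
class OverlapBuffer:
    """Buffer PCM samples arriving from a stream and serve overlapping,
    fixed-size window reads to the hasher (minimal sketch)."""

    def __init__(self):
        self._data = []  # a production version would use a bounded ring buffer
        self._pos = 0    # read position of the next window

    def write(self, samples):
        """Append freshly decoded samples as they arrive from the stream."""
        self._data.extend(samples)

    def read_window(self, window, hop):
        """Return the next window of samples, or None if not enough data
        is buffered yet; advance by `hop` so successive windows overlap."""
        if self._pos + window > len(self._data):
            return None
        out = self._data[self._pos:self._pos + window]
        self._pos += hop
        # a real ring buffer would now discard samples before self._pos
        return out
```

Because the hasher only ever asks for "the next window", this class can stand in for the old in-memory sample array without the hasher knowing the data now arrives incrementally.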

Processing and creating PK_HASHes

The hasher process used to read from the buffered in-memory quasi-stream. Luckily, it used a fairly simple interface to read the data, so that interface could be extracted and reimplemented on top of the buffered stream data. The output side of the class, however, has to be rewritten, since it has no interface-able part to replace.

And so on – the latter should be implemented from scratch, so there is no refactoring story there.

Refactoring pyramid/tower

I can call it a pyramid or a tower because, after a long time of procrastination (subconsciously processing the task at hand), I was finally able to put my hands on the keyboard and start. My premise was that everything had to be checked from the ground up, because NOW I have an algorithm that produces the desired results, and since there are many steps involved, an error in a single step could be untraceable if I didn't check every step along the way.

Tools used

I am kind of old fashioned, so this paragraph won’t be very long.

]]>
http://www.videophill.com/blog/refactoring-its-fun/feed/ 0