
The meaning of the green bars at the bottom.

Posted: Wed Aug 20, 2008 7:16 am
by mbk112
What do the two green bars at the bottom mean? I guess the second one is the strength of the signal, but what about the first? It is always at 100% on my computer.

Regards
Michael

Posted: Wed Aug 20, 2008 11:57 am
by Juergen
Should be level and quality.
Both readings are taken from the driver in use, so they don't necessarily work correctly for all tuners.

Posted: Wed Aug 20, 2008 6:56 pm
by mbk112
Juergen wrote: Should be level and quality.
Both readings are taken from the driver in use, so they don't necessarily work correctly for all tuners.
What is the difference between level and quality?

Regards
Michael

Posted: Wed Aug 20, 2008 10:23 pm
by Juergen
Well, I have to admit, neither is exactly what its name suggests; they are both rough mathematical approximations.

Level represents the RF gain at the tuner input, more or less.
A % scale is of course nonsense there, as it refers to nothing.
It is probably best thought of as related to the state of the AGC regulator stage.
However, the source of this value depends on driver and hardware.
Some other software products try to interpret it as an S/N ratio in dB.
Since noise also depends on bandwidth, so does the S/N ratio.
The level indication may take that into account, but need not...
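
For illustration, a minimal sketch of how an application might read such a raw level value, assuming the Linux DVB v3 frontend API and the device path /dev/dvb/adapter0/frontend0 (the cards discussed in this thread use their own Windows driver interfaces, but the idea is the same):

Code:

/* Sketch: read the raw "level" value from a DVB frontend.
 * The 16-bit value has no defined unit or scale; it is whatever
 * the driver chooses to report. */
#include <stdio.h>
#include <stdint.h>
#include <fcntl.h>
#include <sys/ioctl.h>
#include <linux/dvb/frontend.h>

int main(void)
{
    int fd = open("/dev/dvb/adapter0/frontend0", O_RDONLY); /* assumed path */
    if (fd < 0) { perror("open"); return 1; }

    uint16_t strength = 0;
    if (ioctl(fd, FE_READ_SIGNAL_STRENGTH, &strength) < 0) {
        perror("FE_READ_SIGNAL_STRENGTH");
        return 1;
    }

    /* 0..65535, but minimum, maximum and curve are driver-specific,
     * so a % derived from it refers to nothing in particular. */
    printf("raw level: %u (%.1f%% of the 16-bit range)\n",
           strength, strength * 100.0 / 65535.0);
    return 0;
}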

Quality represents the bit error rate, more or less. Down to a certain grade the errors can be fully corrected; below that, they can't.
That depends on tuner and driver as well.
The lower limit of usable quality may also differ between live TV and recorded files. Recordings need to be free of uncorrected errors.
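
Again for illustration, a sketch of reading the raw error counters behind such a quality bar, under the same Linux DVB v3 assumption; note that FE_READ_BER has no defined time window or scale, and some drivers don't implement it at all:

Code:

/* Sketch: read the raw error counters behind a "quality" bar. */
#include <stdio.h>
#include <stdint.h>
#include <fcntl.h>
#include <sys/ioctl.h>
#include <linux/dvb/frontend.h>

int main(void)
{
    int fd = open("/dev/dvb/adapter0/frontend0", O_RDONLY); /* assumed path */
    if (fd < 0) { perror("open"); return 1; }

    uint32_t ber = 0, ublocks = 0;
    ioctl(fd, FE_READ_BER, &ber);                    /* bit errors, raw driver scale */
    ioctl(fd, FE_READ_UNCORRECTED_BLOCKS, &ublocks); /* blocks the FEC gave up on */

    /* Live TV may tolerate a few uncorrected blocks unnoticed;
     * a recording meant for authoring needs this to stay at zero. */
    printf("raw BER: %u, uncorrected blocks: %u\n", ber, ublocks);
    return 0;
}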

Software that wants to be compatible with as many tuners as possible can't interpret such values perfectly.

In general, these are not really meters, just rough tuning indicators.

Posted: Thu Aug 21, 2008 7:08 am
by ter9999
Juergen wrote: In general, these are not really meters, just rough tuning indicators.

The Technotrend Premium card has a more reasonable and accurate display of level and quality percentages than the Twinhan DVB-S cards.

Posted: Thu Aug 21, 2008 8:19 am
by Juergen
Sure, a bit more reasonable.
But readings of e.g. 13.6 dB can't be true, as that precision is not possible with such hardware.

And % doesn't really make sense for quality. Percent of what?
With a really good BER, you receive more than 99.9% of the bits correctly even without the system's integrated redundancy. FEC alone tolerates far more errors without any data loss. So what does 99% mean then: raw data, corrected data, or (what level of) correct frames or GOPs? The last one is what needs to be 100% correct, at least for recordings made for authoring and the like.
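
To make the "percent of what?" point concrete, a small sketch: converting plausible raw bit error rates into "percent of bits correct" gives numbers that all crowd against 100%, so a percent bar can't distinguish a marginal signal from a clean one (the BER values below are just examples):

Code:

/* Sketch: why "% of correct bits" is a useless quality measure. */
#include <stdio.h>

int main(void)
{
    /* example BERs spanning "barely watchable" to "practically perfect" */
    const double ber[] = { 1e-2, 1e-3, 1e-4, 1e-6, 1e-9 };
    for (int i = 0; i < 5; i++)
        printf("BER %.0e -> %.7f%% of bits correct\n",
               ber[i], (1.0 - ber[i]) * 100.0);
    return 0;
}
/* Everything lands between 99.0000000% and 99.9999999%; on a coarse
 * percent display every line would round to "99%" or "100%". */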

Well, so far I have referred to the SS1.
But what does a widely compatible DVB application actually get, to display L & Q readings?
Basically some bytes from hardware and drivers that in most cases aren't specified in any way.
That means some level values may be taken from the tuner's automatic gain control, or from some demodulator saturation reference, or from any dedicated stage anywhere in the electronics, possibly even from an attempt to somehow calculate signal vs. noise, whatever a hardware developer may think of.
Then the values are pressed into some scale: unknown minimum and maximum, unknown scale factor, linear, logarithmic or whatever else, all unknown.
In the end that scale is transferred to a bit scale that may run from zero to 100, or to 127, or to 255, or whatever else...
That's what the application gets.
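
A sketch of the guesswork this forces on the application: the same raw byte turns into quite different "percentages" depending on which range and curve one assumes (the ranges below are hypothetical, not any particular driver's behaviour):

Code:

/* Sketch: mapping an unspecified raw reading onto a percent bar.
 * Compile with -lm. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    unsigned raw = 180; /* some byte handed up by a hypothetical driver */

    double if_0_255   = raw * 100.0 / 255.0;               /* assume linear 0..255 */
    double if_0_200   = raw * 100.0 / 200.0;               /* assume linear 0..200 */
    double if_log_255 = log1p(raw) * 100.0 / log1p(255.0); /* assume logarithmic   */

    printf("raw %u -> %.0f%%, %.0f%% or %.0f%%, depending on the guess\n",
           raw, if_0_255, if_0_200, if_log_255);
    return 0;
}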

The readings differ even between otherwise identical receivers that come with different tuner packs.
They cannot be compared between different devices in any way.

And even identical tuner packs have tolerances: in frequency response and background noise, individual calibration, component tolerances, ...
Then LNBs differ too, at every stage inside: feed, input stage, mixer, LOF noise of any kind, polarisation and band switching, IF output stage.
After all this come all sorts of cable losses, which aren't linear.

Not to forget, the gain of a normal dish rises with frequency, and so may e.g. (S+N):N.

So, proper metering is a completely different subject.
And a display in tenths of a dB is just as absurd as a portable radio with four A cells inside and a label saying "500 Watts P.M.P.O.".
Something that consumes 10 watts at most from its power source can't really put out half a kW without an integrated perpetuum mobile, flux capacitor or warp generator...