Slow Speeds and Strange HSDPA (Light-Blue/Dark-Blue LED) Switching

techmind
4: Newbie
Hi,

During the past week (since around 19th Sept) my downlink data throughput for large files in the evening has often been less than the usual 1.4Mbps, sometimes down to a few hundred kbps. Is this caused by excess load on my base station (CB22 5BP) from the students returning to university, or is there a network issue?

Another odd thing is that in the past 3-4 days, my dongle (E220) has sometimes been staying in dark-blue 3G rather than jumping to light-blue 3G+ (HSDPA) when I'm using the connection. Sometimes it'll be in light-blue, then switch to dark blue when I start a download or watch a streaming video (which then, unsurprisingly, breaks up and becomes stuttery). I'd never seen that before this week. (This has been happening in the evenings, and now at 11:50 on Saturday morning.)

I'm still in the same place, a hundred yards or so from the base station, with 3-4 bars of signal.
Any ideas?

Thanks,

Andrew
13 REPLIES

bacupian
4: Newbie
If I clip the modem to just the right spot on the curtains I can get an RSSI of 22, i.e. -69dBm. (or even 25, i.e. -63dBm)

Heady, should -79 to -69dBm be "plenty" for good high speed data? VMC shows 3-4 bars at comparable strength.

And would that be consistent with a signal from a microcell 100 yards away (but perhaps shielded by a rather large house)?
And if the modem is potentially receiving from more than one base station, which figure is reported? (the strongest, I guess?)


BTW Heady, you didn't used to work at a company with postcode xxx5HA at some point did you???


I had 3Mbit/s and AT+CSQ giving 7:99

Mostly I get strength around 11 and maybe 1.5Mbit/s

David

heady
4: Newbie
... see how strong a signal I can get from there. I could also write my own little program to monitor the status chatter from the E220 rather than having to watch a crude terminal display 🙂

I already do that - it's quite interesting to see what happens over the day. (RRDtool or Munin or Cacti)

Maybe for debugging purposes I should take up VF's offer of a newer modem.

Just borrow your friend's modems... The E220 is one of the better modems for UNIX'y type systems and in my opinion more robust than many.
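
For anyone wanting to try the same, here's a minimal sketch of that sort of polling script (not my actual setup): it asks the E220 for its RSSI with AT+CSQ once a minute and prints timestamped values that RRDtool, Munin or Cacti could graph. It assumes pyserial is installed, and /dev/ttyUSB0 is just a guess at the dongle's AT command port - adjust to suit.

```python
import re
import time

import serial  # pyserial

PORT = "/dev/ttyUSB0"   # assumed device node for the E220's AT port

def read_rssi(ser: serial.Serial):
    """Send AT+CSQ and return the raw RSSI index (0-31), or None if unknown."""
    ser.reset_input_buffer()
    ser.write(b"AT+CSQ\r\n")
    deadline = time.time() + 2
    while time.time() < deadline:
        line = ser.readline().decode(errors="replace").strip()
        match = re.match(r"\+CSQ:\s*(\d+),\s*(\d+)", line)
        if match:
            rssi = int(match.group(1))
            return None if rssi == 99 else rssi   # 99 = not known/not detectable
    return None

if __name__ == "__main__":
    with serial.Serial(PORT, 115200, timeout=1) as modem:
        while True:
            rssi = read_rssi(modem)
            stamp = time.strftime("%Y-%m-%d %H:%M:%S")
            print(f"{stamp} rssi={'U' if rssi is None else rssi}")
            time.sleep(60)   # one sample a minute is plenty for a daily graph
```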

heady
4: Newbie
... should -79 to -69dBm be "plenty" for good high speed data? ...

First off - we need to understand what the modem RSSI value means:

From the relevant 3GPP standard: TS 27.007 V8.3.0 (2008-03)
RSSI

  • 0: -113 dBm or less
  • 1: -111 dBm
  • 2...30: -109 to -53 dBm
  • 31: -51 dBm or greater
  • 99: not known or not detectable

Or use Techmind's equation:
dBm = (rssi*2)-113

Therefore, a change in RSSI of 1 is an improvement in signal strength of 2dBm (closer to 0dBm is better).
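
As a quick sanity check, here's that mapping as a small Python sketch - the boundary values come straight from the table above, and the sample readings are the ones quoted earlier in the thread:

```python
def rssi_to_dbm(rssi: int) -> str:
    """Map an AT+CSQ RSSI index (0-31, or 99) to an approximate dBm figure."""
    if rssi == 99:
        return "not known or not detectable"
    if rssi <= 0:
        return "-113 dBm or less"
    if rssi >= 31:
        return "-51 dBm or greater"
    return f"{(rssi * 2) - 113} dBm"   # Techmind's equation

# Readings mentioned earlier in the thread:
for r in (7, 11, 22, 25):
    print(r, "->", rssi_to_dbm(r))     # -99, -91, -69, -63 dBm
```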

The value in dBm reflects the loss of power between the signal leaving the amplifier electronics at the NodeB and being received at the electronics of the modem. This power loss is called the path loss. The minimum dBm value (biggest -xxx dBm value) that a radio device receives at is called the receive sensitivity. The link budget is then the addition of all the losses and gains within the transmit path - antennas provide a gain and are +ve, while air, cables, trees etc. produce a loss and are -ve. The range available (i.e. the max distance for reliable communications) then comes down to the receive sensitivity plus a margin.
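
To make that concrete, here is a purely illustrative link-budget sum in Python - every figure below is invented for the example (they are not real values for this site or modem):

```python
# Illustrative only: made-up gains (+ve) and losses (-ve) along the path.
link_budget = {
    "NodeB transmit power (dBm)":     38.0,   # power fed into the path
    "NodeB antenna gain":             +5.0,   # antennas are a gain (+ve)
    "feeder/cable loss":              -2.0,   # cables are a loss (-ve)
    "path loss (air, ~100 yards)":   -78.0,   # free space is a loss (-ve)
    "building loss (large house)":   -25.0,   # obstructions are a loss (-ve)
    "modem antenna gain":              0.0,   # built-in dongle antenna
}

received_dbm = sum(link_budget.values())
receive_sensitivity_dbm = -104.0   # ballpark figure for a consumer modem

print(f"received signal: {received_dbm:.0f} dBm")
print(f"margin above receive sensitivity: {received_dbm - receive_sensitivity_dbm:.0f} dB")
```

As long as that margin stays positive (with some headroom for fading), the link should hold up.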

Personally, I have seen systems work quite happily almost right on the receive sensitivity; and for NodeBs from a few years ago this was pegged at about -125dBm. They have probably improved since then, so depending on when the sites in your area were installed, the maximum uplink receive sensitivity possible for reliable communications may differ.

This works both ways - uplink (from modem to NodeB) and downlink (NodeB to modem).

However, the downlink is restricted, to some extent, by the cost/quality ratio of the modem equipment, though this is mitigated somewhat by the high-power transmitter at the NodeB. The real problem for any consumer mobile system is the uplink, as the power available at the modem is usually very limited.

A really neat little demonstration of distance and path loss is provided here:
(http://www.osischool.com/protocol/wireless/pathloss/index.php)
The difference between 802.11bg and UMTS/HSDPA here is the frequency used: UMTS/HSDPA uses 2.1GHz and 802.11bg uses 2.4GHz. The higher the frequency, the higher the attenuation in free space for the same distance. So UMTS/HSDPA will have less path loss than 802.11bg - or, in effect, will travel further for the same path loss.

And would that be consistent with a signal from a microcell 100 yards away (but perhaps shielded by a rather large house)?

You could try to plug in your values to the equation provided in the example above (remember to change the frequency value to 2.1GHz).
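
If you'd rather not use the web page, the textbook free-space path loss formula it is based on is easy to evaluate yourself. The sketch below uses the 100-yard distance from the question above and compares 2.1GHz against 2.4GHz - free space only, with no allowance for the house in the way:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

d_km = 100 * 0.9144 / 1000   # 100 yards in km
print(f"UMTS/HSDPA at 2.1GHz: {fspl_db(d_km, 2100):.1f} dB")   # ~78 dB
print(f"802.11bg at 2.4GHz:   {fspl_db(d_km, 2400):.1f} dB")   # ~79 dB
```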

Anything better than approx. -50dBm is considered "perfect" and anything worse than approx. -113dBm is considered "really bad"; -69dBm to -79dBm is therefore about mid-range, which is what I'd term "good".

And if the modem is potentially receiving from more than one base station, which figure is reported? (the strongest, I guess?)

For GSM you would be correct, but for UMTS/HSDPA this is not the case. Because 3G communicates on the same frequency in every cell (depending on network design), the RSSI value is the combination of the signals from all surrounding cells. For GSM the signal meter actually has some value; for UMTS/HSDPA, in my opinion, the signal meter is really a placebo, as it is not the signal level that matters but the "quality" of the signal being received.

For a simplistic example: with UMTS/HSDPA you could be in the middle of the desert and have the lowest signal strength possible, yet the signal "quality" might be "perfect" and the system would work quite well. Conversely, you could be in the middle of the city with the highest signal strength possible, but if the "quality" is horrible the system will not work that well.

This was one of the theoretical reasons why, a few years ago, the Australian Government decided to use CDMA/WCDMA to replace the analog POTS network in the rural areas of Australia. Theoretically, CDMA/WCDMA communication distance is affected by signal "quality" and signal strength - not by propagation delay or the design of the system.

However, with WCDMA/HSDPA for "high speed" data transfers there is more to this than just signal "quality" and pure signal strength. Signal strength is just a small part of the overall equation.

BTW Heady, you didn't used to work at a company with postcode xxx5HA at some point did you???

To borrow a phrase: "Unfortunately - I cannot confirm or deny"

heady
4: Newbie
...

This power loss is called the path loss. The minimum dBm value (biggest -xxx dBm value) that a radio device receives at is called the receive sensitivity.

...

This works both ways - uplink (from modem to NodeB) and downlink (NodeB to modem).

...


There was a mistake in the above post (cannot edit now). "(biggest -xxx dBm value)" should be "(smallest -xxx dBm value)".
Biggest would be closest to 0dBm (best) - and smallest would be closest to -999 dBm (worst).

After a bit of "Googling" for some WCDMA/HSDPA modem spec sheets, it would appear that "best quality" modems have a receive sensitivity of better than -115dBm, whereas "good quality" modems sit somewhere around -110dBm, and mass-produced modems lie around the -95dBm to -105dBm mark. The Huawei E220 appears to have a receive sensitivity of approx. -104dBm.