Total Posts: 3 | Showing Posts: 1-3

what is CDMA

kiaralaurent
Posts: 1
2/11/2016 12:23:01 PM
Posted: 9 months ago
Please give me some information about [url=http://www.seekzed.com...]
What is the full form of CDMA[/url] technology and its future and its current status.
triangle.128k
Posts: 3,628
2/17/2016 2:17:16 AM
Posted: 9 months ago
At 2/11/2016 12:23:01 PM, kiaralaurent wrote:
Please give me some information about [url=http://www.seekzed.com...]
What is the full form of CDMA[/url] technology and its future and its current status.

CDMA and GSM are both radio systems used in cell phones. As biased as my statement might be, GSM is vastly superior since it's more commonly used around the world. It's also much easier to switch carriers on GSM, since your subscriber info is stored on a removable SIM card rather than programmed into the handset itself. Not to mention that GSM-family networks are generally faster, which is why AT&T and T-Mobile tend to be faster than Verizon and Sprint.
Ramshutu
Posts: 4,063
3/9/2016 3:52:26 PM
Posted: 8 months ago
At 2/11/2016 12:23:01 PM, kiaralaurent wrote:
Please give me some information about [url=http://www.seekzed.com...]
What is the full form of CDMA[/url] technology and its future and its current status.

When you use a mobile phone, you are broadcasting a signal at a frequency with a given signal bandwidth (and the base station is doing the same). You can't have every single phone attached to a single base station using a slightly different frequency, because you'd very quickly run out of frequencies to use. But if you get two devices transmitting at the same frequency at the same time, you get interference, and you may not be able to recover either signal.

So, a way is needed of dividing up the frequency that devices are transmitting so they don't interfere with each other.

GSM/EDGE does this using a method called Time Division Multiple Access (TDMA). This is a pretty common 2G method, and it's used by a significant number of different technologies, including TETRA (police radios) and cordless telephones (DECT).

This method works by having the phone synchronize the time it broadcasts its signal with the base station, and being allocated a time slot in which to broadcast. This means your phone uses a particular frequency and broadcasts only at specific times (the phone turning its signal on and off in time is part of the reason you hear the "da-d-da, da-d-da" buzz on a TV or radio).
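The time-slot idea can be sketched in a few lines of Python. This is a hypothetical toy model, not real GSM code: the slot count of 8 matches a GSM TDMA frame, but the phone names and scheduler are invented for illustration.

```python
# Toy TDMA model: each phone owns a recurring slot in an 8-slot frame,
# like GSM, and may only burst during its own slot.
SLOTS_PER_FRAME = 8
phones = ["phone_A", "phone_B", "phone_C"]

# Assign each phone a fixed slot number within the frame
slot_of = {phone: i for i, phone in enumerate(phones)}

def who_transmits(time_step):
    """Return which phone (if any) owns the slot at this time step."""
    slot = time_step % SLOTS_PER_FRAME
    for phone, s in slot_of.items():
        if s == slot:
            return phone
    return None  # unused slot stays silent

# Over one frame, each phone bursts only in its own slot
schedule = [who_transmits(t) for t in range(SLOTS_PER_FRAME)]
print(schedule)
# -> ['phone_A', 'phone_B', 'phone_C', None, None, None, None, None]
```

The key point is that only one device "speaks" at any instant; everyone else waits for their slot to come around again.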

Now, that's fine for GSM, which has a relatively low bandwidth, but it has a big problem when you want to start moving data across the phone signal. If you want more bandwidth, you broadcast a signal with a wider frequency range (GSM is about 270 kHz, 3G phones are around 3.84 MHz), meaning you can carry more data.

However, now comes a problem. In GSM and other TDMA schemes, it doesn't matter what the phone is actually doing: it still uses the whole frequency range in its time slot, regardless of whether you're sending a text message, making a phone call, or browsing the internet.

That makes for a LOT of wasted space that could be used for sending and receiving data.

For 3G phones, the problem was solved by having every phone transmit at the same time whenever it needed to, but instead of having their signals divided in time, they're divided with a mathematical code called a Walsh code (hence Code Division Multiple Access). It's not an "actual" code in the secret-message sense, just a sequence of numbers used to spread each phone's signal. These codes let all mobiles broadcast at the same frequency without their data overlapping or interfering with each other, and on the base station side each one can be reconstructed.
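You can demonstrate the trick in a few lines of NumPy. This is a minimal sketch, not real baseband code: Walsh codes are rows of a Hadamard matrix, two users spread one data bit each, their signals add "in the air", and the base station recovers each bit by correlating against the right code.

```python
import numpy as np

def walsh_matrix(n):
    """Build an n x n Walsh/Hadamard matrix (n must be a power of 2).
    Its rows are mutually orthogonal spreading codes."""
    h = np.array([[1]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])
    return h

codes = walsh_matrix(4)  # 4 orthogonal codes of length 4

# Two users each send one data bit (mapped to +1/-1), spread by their code
user_a_bit, user_b_bit = 1, -1
signal_a = user_a_bit * codes[1]
signal_b = user_b_bit * codes[2]

# Both transmit at the same time on the same frequency: signals simply add
combined = signal_a + signal_b

# The base station despreads with each user's own code (normalized dot product)
recovered_a = np.dot(combined, codes[1]) / len(codes[1])
recovered_b = np.dot(combined, codes[2]) / len(codes[2])
print(recovered_a, recovered_b)  # -> 1.0 -1.0
```

Because the codes are orthogonal (their dot product is zero), each user's contribution vanishes when you despread with someone else's code, which is exactly why the signals don't interfere.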

This also solves the TDMA problem: a CDMA mobile that needs more bandwidth can spread its signal over multiple codes and take up more of the overall bandwidth, and a mobile that needs less can use fewer. This allows the system to respond dynamically to the bandwidth needs of individual devices. CDMA systems include many 2.5G and most 3G phone systems, including the 3GPP family: UMTS, HSDPA, HSPA+ and a few others.

That covers what CDMA actually is. But in terms of its state as a technology, it's on its way out, for a few reasons:

1.) If you want to use CDMA, you have to pay through the nose to Qualcomm, which holds many or most of the patents and licenses everything.

2.) CDMA still isn't super efficient, as it doesn't split up the signal finely enough without being prone to noise. (I.e., the more data you try to shove through, the more small fluctuations affect the received signal and force retransmissions.)

LTE and modern wireless LAN technologies use a different method, OFDM (Orthogonal Frequency Division Multiplexing); the multiple-access version used by LTE is called OFDMA.

Instead of dividing things up just in time, or with codes, OFDM systems split the transmitted signal into many small subcarriers that sit very close together. Each device can be allocated a group of these frequencies for a certain period of time. If you want to stream Netflix, your device will be allocated a lot of the frequencies for most of the available broadcast time. If you're just browsing the internet, you're allocated a few frequencies for short periods of time, and the network can allocate the unused portions to everyone else.
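A toy scheduler makes the allocation idea concrete. Everything here is invented for illustration (the subcarrier count, device names, and demand numbers are not from any real standard); it just shows subcarriers being handed out in proportion to demand.

```python
# Hypothetical OFDMA-style scheduler: hand out groups of narrow subcarriers
# for one time slot in proportion to each device's demand.
SUBCARRIERS = 12

def allocate(demands):
    """Split SUBCARRIERS among devices proportionally to demand,
    in whole subcarriers, giving any leftovers to the hungriest device."""
    total = sum(demands.values())
    alloc = {dev: SUBCARRIERS * d // total for dev, d in demands.items()}
    leftover = SUBCARRIERS - sum(alloc.values())
    hungriest = max(demands, key=demands.get)
    alloc[hungriest] += leftover
    return alloc

# A video stream wants far more than a light web-browsing session
print(allocate({"netflix_stream": 5, "web_browsing": 1, "idle_chat": 2}))
# -> {'netflix_stream': 8, 'web_browsing': 1, 'idle_chat': 3}
```

Real schedulers also juggle signal quality, fairness, and priorities, but the core idea is the same: unused frequencies in any slot go to whoever needs them.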

There is also the added benefit of being able to change the type of signal being broadcast to account for how noisy the connection is. Most phone signals broadcast a number of bits at a specific rate; the bits being transmitted are determined by the phase and amplitude of the signal at a given time. Each collection of bits is called a symbol, and symbols are transmitted at a fixed rate. One of the most basic schemes is QPSK (Quadrature Phase Shift Keying), with 2 bits per symbol. It has 4 phases, at +/-45 and +/-135 degrees, with a fixed amplitude. The base station receives the signal and recovers the phase changes: if at a given point the signal is at -45 degrees, the device has sent "00"; if it moves to +135, that's "01"; -135 is "10"; and +45 degrees is "11".
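Here's a small sketch of that QPSK mapping, using the same bit-to-phase table as above. Note the exact mapping varies between real standards; this just follows the post, and the "noise" is a made-up fixed phase offset to show how far a symbol can drift and still decode correctly.

```python
# QPSK map from the text: phase (degrees) -> 2-bit symbol
PHASE_TO_BITS = {-45: "00", 135: "01", -135: "10", 45: "11"}
BITS_TO_PHASE = {b: p for p, b in PHASE_TO_BITS.items()}

def modulate(bits):
    """Turn a bit string into a list of phases, two bits per symbol."""
    return [BITS_TO_PHASE[bits[i:i+2]] for i in range(0, len(bits), 2)]

def demodulate(phases):
    """Snap each received phase to the nearest constellation point."""
    out = []
    for ph in phases:
        nearest = min(PHASE_TO_BITS, key=lambda p: abs(p - ph))
        out.append(PHASE_TO_BITS[nearest])
    return "".join(out)

tx = modulate("0011")          # -> phases [-45, 45]
noisy = [p + 20 for p in tx]   # add 20 degrees of phase error to each symbol
print(demodulate(noisy))       # still decodes to "0011"
```

Because the four points sit 90 degrees apart, anything up to 45 degrees of phase error still snaps back to the right symbol.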

That's not very prone to noise, because the points are all far apart, and the amplitude of the signal doesn't matter a great deal.

However, a system such as 256 QAM (Quadrature Amplitude Modulation) transmits 8 bits per symbol, using both the phase and the relative amplitude of the signal to encode the bits. This gives you more data, but it's very prone to noise, because the amplitude and phase of a "00011101" are very similar to those of "00011011", so if there are errors in your signal, you'll lose data pretty quickly.
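You can put a rough number on how much more fragile 256 QAM is. The sketch below is purely illustrative (a square constellation grid scaled to the same peak amplitude, ignoring coding and real power normalization): it compares the minimum spacing between constellation points, which is what noise has to push a symbol across to cause an error.

```python
import math

def square_qam_min_distance(bits_per_symbol, peak=1.0):
    """Minimum spacing between points of a square QAM grid whose
    outermost corner sits at amplitude `peak`."""
    side = 2 ** (bits_per_symbol // 2)  # points per axis: QPSK 2, 256 QAM 16
    # Grid levels run -(side-1) .. +(side-1) in steps of 2, then get scaled
    # so the corner point has amplitude `peak`.
    scale = peak / (math.sqrt(2) * (side - 1))
    return 2 * scale

d_qpsk = square_qam_min_distance(2)    # 4 points,   2 bits/symbol
d_qam256 = square_qam_min_distance(8)  # 256 points, 8 bits/symbol
print(round(d_qpsk / d_qam256, 1))     # -> 15.0
```

Under these toy assumptions the QPSK points sit 15 times farther apart than the 256 QAM points, so the same noise that QPSK shrugs off can easily flip a 256 QAM symbol into its neighbor.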

LTE and other OFDM systems allow these schemes to be changed dynamically by a device, so that if you're standing under a base station you'll use 256 QAM, but if you're far, far away with a bad signal, you'll use QPSK. Less data, but the other side can actually read it!
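That link-adaptation step can be sketched as a simple lookup. The SNR thresholds and scheme list here are invented for illustration (real LTE uses reported CQI values and standardized tables), but the shape of the logic is the same: pick the densest modulation the current signal quality can support.

```python
# Hypothetical link-adaptation table (thresholds are made up, not from the
# LTE spec): (minimum SNR in dB, scheme name, bits per symbol)
SCHEMES = [
    (22, "256QAM", 8),
    (16, "64QAM", 6),
    (10, "16QAM", 4),
    (0,  "QPSK",  2),
]

def pick_scheme(snr_db):
    """Choose the densest modulation the current signal quality supports."""
    for min_snr, name, bits in SCHEMES:
        if snr_db >= min_snr:
            return name, bits
    return "QPSK", 2  # worst case: fall back to the most robust scheme

print(pick_scheme(25))  # right under the base station -> ('256QAM', 8)
print(pick_scheme(3))   # weak signal far away         -> ('QPSK', 2)
```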

Anyway, I hope that answers your question!