TRAI has proposed (pdf) fining telecom operators based on the degree of non-compliance with Quality of Service (QoS) standards, or "on the basis of how bad" a telecom operator performed, rather than levying a flat fine per instance of non-compliance. Currently, telcos are fined Rs 1 lakh for the first instance of non-compliance; for consecutive or repeated instances, the fine rises to Rs 1.5 lakh for the second instance and Rs 2 lakh for the third. This penalty structure, however, does not take into account the severity of the shortfall:
“The financial disincentive is same whether the benchmark is not met by 1% or 5%. One option towards streamlining the Quality of service parameters will be to explore the possibility of a scheme of graded financial disincentive so that in the case of very poor performance the financial disincentive could be very stringent,” TRAI suggested in the paper.
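TRAI does not specify slabs or amounts in the paper; purely as an illustration, a graded scheme could scale the fine with the size of the benchmark miss. All numbers and the scaling rule below are hypothetical:

```python
# Hypothetical sketch of a graded financial disincentive: the fine grows
# with how far the telco misses the QoS benchmark. The slabs, amounts and
# scaling rule below are illustrative assumptions, not TRAI's proposal.

def graded_fine(benchmark: float, actual: float, base_fine: float = 100_000) -> float:
    """Return a fine (in Rs) that scales with the shortfall from the benchmark.

    For a 'lower is better' metric such as call drop rate, the shortfall is
    how far `actual` exceeds `benchmark` (in percentage points).
    """
    shortfall = max(0.0, actual - benchmark)
    if shortfall == 0:
        return 0.0
    # Assumed rule: each full percentage point over the benchmark adds
    # one more multiple of the base fine.
    return base_fine * (1 + shortfall)

# A telco at 3% call drop against a 2% benchmark pays less than one at 7%.
print(graded_fine(2.0, 3.0))  # 200000.0
print(graded_fine(2.0, 7.0))  # 600000.0
```

Under such a rule, missing the benchmark by 1% and missing it by 5% would no longer attract the same penalty, which is precisely the gap TRAI's quote identifies.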
TRAI says telcos have repeatedly breached benchmarks: The regulator reviews a telecom operator’s performance based on compliance reports submitted by the telco, and it found that in many cases, “the amount of financial disincentives (fine) had not acted as a sufficient deterrent against non-compliance”. Telcos were found to have repeatedly breached compliance norms or failed to meet service quality benchmarks, which, according to TRAI, “indicates lack of commitment or initiative on the part of TSPs to improve the quality of service”.
The questions posed in the consultation paper are at the end of the article; submissions have to be sent to firstname.lastname@example.org by 26th August 2016, and counter comments will be allowed till 2nd September 2016.
Creation of Quality of Experience parameter
TRAI has also proposed creating a Quality of Experience (QoE) benchmark for telcos, an aggregated benchmark that takes into account key network quality parameters, data from consumer surveys, and data from compliance reports submitted by telcos. This would create a consumer-centric benchmark to understand if users are satisfied with a telco’s service. The Quality of Experience benchmark may include:
–Network Service Quality Index (NSQI), which is an aggregated index consisting of call drop rate, network availability, BTS downtime, congestion rates, and several other network quality standards.
–Customer Service Quality Index (CSQI) can include parameters like resolution of billing/ charging complaints, accessibility of call centre/ customer care, percentage of calls answered by the operators (voice to voice) within 60 seconds, Time taken for refund of deposits after closure, etc
–Customer Satisfaction Survey Quality Index (CSSQI), which can be calculated on the basis of multiple consumer surveys over time.
“These different Indexes for cellular mobile telephone service…could then be used to evaluate the performance of the service providers on each parameter- based on a 10 point score by giving equal weightage to each parameter,” added TRAI in the paper.
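As a rough sketch of the aggregation TRAI describes, equal weightage on a 10-point scale reduces to a simple mean of the per-parameter scores. The parameter names and score values below are invented for illustration:

```python
# Minimal sketch of an equal-weight, 10-point index as described in the
# consultation paper. The parameter scores below are made-up values.

def quality_index(scores: dict[str, float]) -> float:
    """Average per-parameter scores (each 0-10) with equal weightage."""
    return sum(scores.values()) / len(scores)

nsqi_scores = {  # hypothetical Network Service Quality Index inputs
    "call_drop_rate": 8.0,
    "network_availability": 9.5,
    "bts_downtime": 7.0,
    "congestion_rate": 6.5,
}
print(round(quality_index(nsqi_scores), 2))  # 7.75
```

The same averaging could then be repeated across the NSQI, CSQI and CSSQI to produce a single QoE figure per operator, though the paper leaves the exact composition open for consultation.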
QoS benchmarks could be leveled down to BTS or district level
TRAI has proposed BTS-wise, district-wise or locality-level monitoring of QoS standards rather than only considering an entire service area that a telco operates in. With identification of a telco’s service quality specific to a district or a BTS:
- Customers can make informed decisions, as they will be aware of the quality of service in the areas they reside in;
- Telcos will be pressured to invest more in infrastructure in localised areas where service quality is found to be very poor.
In order to evaluate a telco’s quarterly performance against each QoS benchmark, the telco’s performance data is averaged each month per service area and assessed accordingly. However, this data currently reflects only the overall performance of an entire service area, and is not specific to districts or smaller localities where the quality of service could be poor.
TRAI added that although most telcos are generally meeting the required call drop rate benchmark of 2%, the situation isn’t the same at smaller levels. At the Base Transceiver Station (BTS) level, more than 12% of individual BTSs across the country had a call drop rate above 2%, and approximately 1% of individual BTSs had a call drop rate of more than 10%. Overall, however, the call drop rate across the country is 0.7%, added TRAI.
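How a healthy service-area average can coexist with badly failing individual sites is easy to see in a small sketch. The per-BTS call counts and drop counts below are invented:

```python
# Sketch showing how a service-area average can hide poor individual BTSs.
# Call counts and drops per BTS are invented illustrative numbers.

bts_stats = {  # bts_id: (total_calls, dropped_calls)
    "BTS-A": (10_000, 50),  # 0.5% drop rate
    "BTS-B": (10_000, 40),  # 0.4%
    "BTS-C": (500, 60),     # 12% -- a badly performing site
}

total_calls = sum(calls for calls, _ in bts_stats.values())
total_drops = sum(drops for _, drops in bts_stats.values())
overall = 100 * total_drops / total_calls
print(f"overall drop rate: {overall:.2f}%")  # 0.73%: meets the 2% benchmark

failing = [b for b, (calls, drops) in bts_stats.items() if 100 * drops / calls > 2]
print("BTSs above 2%:", failing)  # ['BTS-C']
```

Because the failing site carries relatively few calls, its 12% drop rate barely moves the aggregate, which is why TRAI is considering monitoring at the BTS or district level.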
Below is operator-wise, district-level data of telcos with more than 2% call drop rate:
Tweaks to existing QoS standards
A mobile station hosting a voice call also tracks the ‘link quality’ of its connection. If link quality is low, transmission quality degrades and can eventually lead to a call drop. A counter is adjusted every time a radio block is lost in transmission, and the link between the mobile tower and the cellular device is terminated when the counter value reaches zero, indicating a Radio Link Failure (RLF); eventually the mobile station might limit the number of calls it can host.
In order to return the mobile station to normal, a telco sets a time-out value (the Radio Link Timeout, or RLT) for the radio link to refresh and come back to working condition. TRAI says telcos can increase or decrease the timeout value depending upon the region, whether urban, semi-urban or rural:
“Though the RLT value (time out value) is normally set up as per the network, setting up high values for the same could lead to customer dissatisfaction. Normally this is defined, for areas of light traffic and large coverage (rural areas) to be between 36 to 48; for areas of heavy traffic (urban areas) to be between 20 to 32 and for semi-urban areas and in areas with heavy traffic (with microcells) to be between 4 to 16,” said TRAI in the paper.
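A simplified model of the counter mechanism described above helps show why the RLT value matters. In GSM-style networks the counter typically starts at the RLT value, is decremented on each lost radio block and credited back on good ones (capped at the RLT value), and the call is dropped when it reaches zero; the credit amount and the loss pattern below are assumptions:

```python
# Simplified simulation of the Radio Link Timeout (RLT) counter: starts at
# the RLT value, decremented per lost radio block, credited back on good
# blocks (capped), call dropped at zero. The +2 credit and the loss
# pattern are assumptions for illustration.

def call_survives(rlt: int, blocks: list[bool]) -> bool:
    """blocks: True = radio block received OK, False = block lost."""
    counter = rlt
    for ok in blocks:
        if ok:
            counter = min(rlt, counter + 2)  # assumed credit per good block
        else:
            counter -= 1
            if counter == 0:
                return False  # radio link failure: call dropped
    return True

bad_patch = [False] * 10 + [True] * 20  # 10 consecutive lost blocks
print(call_survives(4, bad_patch))      # False: low RLT drops the call early
print(call_survives(32, bad_patch))     # True: high RLT rides out the patch
```

This is the trade-off in TRAI's quote: a high RLT keeps poor-quality calls alive longer (masking drops but frustrating users), while a low RLT terminates them quickly.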
Data analytics on Call Data Records
Call Data Records (CDRs) capture data including cell ID, signal levels, voice quality, duration of call, etc. TRAI said that a CDR analysis of the Delhi region pointed out that “more than 30% of the CDR data were available for less than 30 seconds only”. This could imply either that the calls were made for a short duration, or that the call was dropped within just 30 seconds. It was also noted that some of the calls were repeat calls, which might indicate multiple failures in establishing a connection, added TRAI.
Through data analytics on CDRs, it may be statistically possible to identify instances of call drops, and TRAI plans on setting a QoS benchmark on the basis of the analytics performed:
“The CDRs with low signal level and poor voice quality and repeat of such calls within 30 seconds will give a clear indication of the call being dropped in the network. The TSPs could be mandated to identify such calls dropped in the network and calculate the call drop rate. TRAI could set a quality of service benchmark standards for the same,” added TRAI in the paper.
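The heuristic in the quote, a short call with low signal and poor voice quality followed by a repeat call within 30 seconds, can be sketched over mock CDRs. The field names, thresholds and sample records below are all assumptions:

```python
# Sketch of flagging likely dropped calls from CDRs using the heuristic in
# the paper: low signal, poor voice quality, and a repeat call to the same
# number within 30 seconds. Field names and thresholds are assumptions.

from dataclasses import dataclass

@dataclass
class CDR:
    caller: str
    callee: str
    start: int          # call start, seconds (simplified clock)
    duration: int       # seconds
    signal_dbm: int     # received signal level
    voice_quality: int  # assumed RxQual-style score, higher = worse

def likely_drops(cdrs: list[CDR]) -> list[CDR]:
    cdrs = sorted(cdrs, key=lambda c: c.start)
    flagged = []
    for i, c in enumerate(cdrs):
        poor_link = c.signal_dbm < -100 and c.voice_quality >= 5
        redial = any(  # same pair calls again within 30s of this call ending
            n.caller == c.caller and n.callee == c.callee
            and 0 < n.start - (c.start + c.duration) <= 30
            for n in cdrs[i + 1:]
        )
        if c.duration < 30 and poor_link and redial:
            flagged.append(c)
    return flagged

calls = [
    CDR("A", "B", 0, 12, -105, 6),   # short, poor link...
    CDR("A", "B", 20, 120, -80, 1),  # ...redialled 8s later: flagged
    CDR("A", "C", 0, 12, -70, 1),    # short but good link: not flagged
]
print(len(likely_drops(calls)))      # 1
```

A benchmark could then be set on the ratio of flagged calls to total calls, which is the kind of CDR-derived call drop rate TRAI floats in Question 6.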
Questions for consultation
Question 1: In case QoS is mandated at a sub-service area level, which option (LDCA-wise or District Headquarter/ city/ town-wise or BTS-wise) would you recommend? Please comment with justifications.
Question 2: How should the call drop rate be calculated: at the Licensed service area level during TCBH, or at the BTS level during the Cell Bouncing Busy Hour (CBBH)? Which should be the benchmark? Please give your views on each parameter, with justification.
Question 3: How should the benchmark for the parameters be revised? Should it be licensed service area wise or district wise or BTS-wise or a combination? In such cases what should be the benchmarks? How should the benchmarks be measured? Please give your views on each parameter, with justification.
Question 4: How could the network parameters be technology agnostic? What are the parameters and benchmarks that are required to be defined? Please give your views with justifications.
Question 5: Do you think it is essential to mandate the TSPs to set the RLT parameter? If so, what should be the criteria to set the value, and what value should be set? Please comment with justifications.
Question 6: Do you think it will be appropriate to calculate call drop rate through CDR meta data analysis? If so, what should be the benchmarks for such call drop rates calculated? Please comment with justifications.
Question 7: Do you think calculation of a customer satisfaction index will help in the QoE of the consumer? If so, elaborate the methodology of the calculation of such indexes. What are the latent variables that need to be defined, and how are they to be calculated? Please comment with justifications.
Question 8: What are your views on introducing graded financial disincentives based on performance, and what should be the quantum of such financial disincentives for various parameters? Please comment with justifications.
Download: Consultation Paper