ISO/IEC JTC 1 SC 42 Artificial Intelligence - Working Group 4
Use Cases & Applications
   04/19/2024

Editor's comments and enhancements are shown in green.

The quality of use case submissions will be evaluated for inclusion in the Working Group's Technical Report based on the application area, relevant AI technologies, credible reference sources (see References section), and the following characteristics:

  • [1] Data Focus & Learning: Use cases for AI systems that utilize machine learning, and those that use a fixed a priori knowledge base.
  • [2] Level of Autonomy: Use cases demonstrating several degrees (dependent, autonomous, human/critic in the loop, etc.) of AI system autonomy.
  • [3] Verifiability & Transparency: Use cases demonstrating several types and levels of verifiability and transparency, including approaches for explainable AI, accountability, etc.
  • [4] Impact: Use cases demonstrating the impact of AI systems to society, environment, etc.
  • [5] Architecture: Use cases demonstrating several architectural paradigms for AI systems (e.g., cloud, distributed AI, crowdsourcing, swarm intelligence, etc.).
  • [6] Functional aspects, trustworthiness, and societal concerns
  • [7] AI life cycle components include acquire/process/apply.
These characteristics are identified in red in the use case.

ID: 31
Use Case Name: Autonomous network scenarios
Application Domain: ICT
Deployment Model: Cyber-physical systems
Status: PoC
Scope: Communications network
Objective(s): Clarification and showcases of autonomous network usage
Short Description (up to 150 words): Multiple scenarios of autonomous networks enabled by AI are addressed to improve operational efficiency, customer experience, and service innovation, including wireless network performance improvement, optical network failure prediction, and data center energy saving.
Complete Description: The leading reason to adopt AI-assisted network automation is cost reduction, which almost 80% of operators placed among their top three drivers, followed by:
  • improvement to customers’ network quality of experience
  • efficient planning and management of dense networks
  • part of an end-to-end automation strategy spanning the network and IT operations
    While OPEX reduction is the most important cost-related driver, others include better alignment of network costs to the revenue generated, and the ability to defer some capital expenditure (CAPEX) by using existing assets more efficiently.

    The autonomous self-driving network needs to move from an O&M approach focused on network elements to one based on usage scenarios. This means that process changes relate directly to a particular result, defined by the operator and carrying business value. Progress will be accelerated if a core set of scenarios of value to all operators is defined; development of the related self-driving network solutions can then be prioritized accordingly.

    The criteria for the selection of scenarios are as follows:

  • Extent of digitalization: Reflects the technical readiness of the scenarios. Digitalization is the foundation of automation, and the extent to which it is supported determines the extent to which automation can be achieved immediately;
  • TCO contribution: Reflects OPEX savings and the improvement to CAPEX efficiency in the given scenario;
  • O&M life cycle: Reflects the ability to build differentiation in each phase of the life cycle in order to achieve full autonomy across many scenarios. The O&M life cycle spans planning, deployment, maintenance, optimization, and provisioning of the network, and scenarios have been identified for each phase.
    Based on those three criteria, we selected six typical key scenarios for the purpose of illustration and clarification.
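
    As an illustration only, the three criteria could be combined into a single priority score for ranking candidate scenarios. The following sketch is minimal; the candidate scores and weights are assumptions made for illustration, not values taken from this report.

    # Minimal sketch: ranking candidate scenarios by the three selection criteria.
    # The per-criterion scores and weights below are illustrative assumptions.
    CRITERIA_WEIGHTS = {"digitalization": 0.4, "tco_contribution": 0.4, "om_lifecycle": 0.2}

    candidates = {
        "base_station_deployment": {"digitalization": 0.8, "tco_contribution": 0.6, "om_lifecycle": 0.7},
        "site_power_saving":       {"digitalization": 0.9, "tco_contribution": 0.8, "om_lifecycle": 0.5},
    }

    def priority(scores: dict) -> float:
        """Weighted sum of the per-criterion scores (all in [0, 1])."""
        return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

    for name, scores in sorted(candidates.items(), key=lambda kv: priority(kv[1]), reverse=True):
        print(f"{name}: priority {priority(scores):.2f}")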

    Scenario 1: Base Station Deployment

    1. Definition and Description of Scenario
      The base station deployment scenario covers the entire process after the site survey, including network planning and design, site design, configuration data preparation, site installation, site commissioning, and site acceptance (an illustrative sketch follows at the end of this scenario).
    2. Automation Classification

      Level 1: The O&M tool helps automate some elements of the process, but configuration and site acceptance have to be done manually.

      Level 2: Some hardware can be detected and configured automatically, and configuration data is simplified based on rules.

      Level 3: E2E automation: radio parameter self-planning, hardware self-detection and self-configuration, self-acceptance without dialing test.

      Initial outcomes: With the adoption of AI, the following initial results have been achieved:
      - Site deployment time shortened by 30%
      - Feature deployment time shortened by 60%
      - Performance convergence time shortened by 85%
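
      As an illustration of the Level 2/3 idea of hardware self-detection and rule-based configuration data preparation, the following minimal sketch derives per-unit configuration from detected hardware. The hardware fields, bands, and power rules are hypothetical assumptions, not values from this use case.

      # Minimal sketch: generating site configuration data from self-detected hardware.
      # The hardware inventory and the rule table are hypothetical; they only illustrate
      # replacing manual configuration data preparation with rule-based generation.
      detected_hardware = [
          {"unit": "RRU-1", "band": "n78", "sectors": 3},
          {"unit": "RRU-2", "band": "n41", "sectors": 3},
      ]

      def generate_config(hw: dict) -> dict:
          """Derive per-unit radio configuration from a simple rule table."""
          power_per_band = {"n78": 200, "n41": 160}   # assumed transmit power (W) per band
          return {
              "unit": hw["unit"],
              "cells": hw["sectors"],
              "tx_power_w": power_per_band.get(hw["band"], 100),
          }

      site_config = [generate_config(hw) for hw in detected_hardware]
      print(site_config)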

    Scenario 2: Network Performance Monitoring

    1. Definition and Description of Scenario
      The mobile network has entered a stage of highly precise planning of sites and resources: on the one hand, identifying and forecasting high-traffic areas and allocating resources precisely to support business goals; on the other hand, identifying and forecasting frequently recurring temporary traffic and scheduling resources to meet business objectives (an illustrative forecasting sketch follows at the end of this scenario).
    2. Automation Classification
      Level 1: Network quality is consistent, and network anomalies can be discovered by tools;
      Level 2: 3D presentation of network quality and anomalies, and network planning is self-generated;
      Level 3: E2E closed-loop monitoring and planning: predicting network development according to historical network information, finding value areas and hidden problems, recommending the best network planning and estimating the gain automatically.
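
      As an illustration of the Level 3 idea of predicting network development from historical information and finding value areas, the following minimal sketch extrapolates per-cell traffic and flags cells approaching an assumed capacity threshold. The traffic samples, threshold, and naive linear-trend forecast are assumptions for illustration only.

      # Minimal sketch: flagging cells whose forecast traffic approaches a capacity threshold.
      # Traffic samples, the threshold, and the linear-trend forecast are illustrative assumptions.
      from statistics import mean

      def forecast_next(history: list) -> float:
          """Naive linear-trend forecast: last value plus the average step between samples."""
          steps = [b - a for a, b in zip(history, history[1:])]
          return history[-1] + mean(steps)

      cell_traffic_gb = {                  # daily traffic per cell (GB), hypothetical
          "cell_A": [40, 44, 47, 52, 55],
          "cell_B": [18, 17, 19, 18, 18],
      }
      CAPACITY_THRESHOLD_GB = 60

      for cell, history in cell_traffic_gb.items():
          predicted = forecast_next(history)
          if predicted > CAPACITY_THRESHOLD_GB * 0.9:   # flag cells nearing capacity
              print(f"{cell}: forecast {predicted:.1f} GB -> candidate for capacity expansion")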

    Scenario 3: Fault Analysis and Handling

    1. Definition and Description of Scenario
      Security and reliability are the most important missions of the network, so quick alarm detection and quick fault healing are essential. The fault analysis and handling scenario comprises several steps, including alarm monitoring, root cause analysis, and fault remediation (an illustrative correlation sketch follows at the end of this scenario).
      Monitoring: Real-time monitoring of network alarms, performance, configuration, user experience, and other information.
      Analysis: By analyzing the correlation between alarms and data from other dimensions, the root cause of a fault can be located and repair actions derived quickly.
      Healing: Repair the fault remotely or through a site visit, based on the repair suggestions.
    2. Automation Classification
      Level 1: Some tools are used to simplify alarm processing, but thresholds and alarm correlation rules are set manually based on expert experience.
      Level 2: Automatic alarm correlation and root cause analysis.
      Level 3: Closed-loop alarm analysis and handling: based on intelligent correlation analysis of multi-dimensional data, accurate location of the alarm root cause, precise fault ticket dispatching, and fault self-healing can be achieved.
      Level 4: Proactive troubleshooting: Based on the trend analysis of alarms, performance, and network data, alarms and faults could be predicted and rectified in advance.

      Initial outcomes: With the adoption of AI, the following initial results have been achieved:

      - Reduction of alarms: 90%
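
      As an illustration of Level 2/3 alarm correlation and root cause analysis, the following minimal sketch groups alarms within a time window and treats the alarm on the most upstream element (per a simple assumed topology) as the root cause candidate. The topology, alarm list, and 60-second window are hypothetical.

      # Minimal sketch: time-window alarm correlation with topology-based root-cause selection.
      # The topology, alarm list, and 60-second window are illustrative assumptions.
      from collections import defaultdict

      UPSTREAM_OF = {"cell_17": "router_3", "cell_18": "router_3", "router_3": None}  # child -> parent

      alarms = [  # (timestamp in seconds, element, alarm text)
          (100, "router_3", "LINK_DOWN"),
          (102, "cell_17", "CELL_UNAVAILABLE"),
          (104, "cell_18", "CELL_UNAVAILABLE"),
          (900, "cell_18", "HIGH_TEMPERATURE"),
      ]

      def correlate(alarms, window=60):
          """Group alarms into time windows and pick the most upstream element as root cause."""
          groups = defaultdict(list)
          for ts, element, text in alarms:
              groups[ts // window].append((ts, element, text))
          for _, group in sorted(groups.items()):
              # an element with no alarmed upstream parent in the group is the likely root cause
              alarmed = {e for _, e, _ in group}
              roots = [e for e in alarmed if UPSTREAM_OF.get(e) not in alarmed]
              yield roots, group

      for roots, group in correlate(alarms):
          print("root cause candidates:", roots, "correlated alarms:", len(group))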

    Scenario 4: Network Performance Improvement

    1. Definition and Description of Scenario
      Wireless networks are geographically very distributed, and activity varies significantly in different places and at different times of day. This makes the network very dynamic and complex. That complexity is further increased by the diversity of services and of terminal performance, and by the mobility of users. If the network cannot achieve the benchmark KPIs or SLAs (service level agreements), or enable good user experience, it must be adjusted to meet or exceed those requirements.

      This is the function of network performance improvement or optimization.

      The complete process of network performance improvement or optimization includes several stages (an illustrative closed-loop sketch follows at the end of this scenario):

      • network monitoring and evaluation
      • root cause analysis of performance problems
      • optimization analysis and optimization decision-making
      • optimization implementation
      • post-evaluation and verification

    2. Automation Classification
      Level 2: Drive test evaluation is not required for coverage optimization. Adjustment suggestions are provided automatically.
      Level 3: Closed-loop of network performance improvement:

      Automatic identification of network coverage and quality problems, automatic configuration of performance parameters, and automatic evaluation.

      Level 4: Dynamic adjustment is implemented based on scenario awareness and prediction to achieve optimal network performance. Network prediction capability is available: scenario change trends can be perceived, and the network configuration can be adjusted in real time to achieve optimal performance.

      Initial outcomes: With the adoption of AI, the following initial results have been achieved:

      - Capacity increase: 30%

      - Delivery duration: 2 weeks, with no manual intervention
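
      As an illustration of the Level 3 closed loop (monitoring and evaluation, optimization decision, implementation, and post-evaluation), the following minimal sketch adjusts a single assumed parameter when a KPI falls below target. The KPI values, the antenna-tilt rule, and the simulated effect are assumptions for illustration only.

      # Minimal sketch of a monitor -> analyze -> adjust -> verify loop for one KPI.
      # KPI values, the tilt adjustment rule, and the target are illustrative assumptions.
      KPI_TARGET = 0.98          # assumed SLA on a coverage/quality KPI

      cells = {"cell_A": {"kpi": 0.95, "tilt_deg": 4}, "cell_B": {"kpi": 0.99, "tilt_deg": 6}}

      def measure(cell: dict) -> float:
          return cell["kpi"]                     # placeholder for live KPI collection

      def adjust(cell: dict) -> None:
          """Apply one optimization step; here, a one-degree antenna downtilt change."""
          cell["tilt_deg"] -= 1
          cell["kpi"] += 0.04                    # assumed effect, stands in for the real network response

      for name, cell in cells.items():
          if measure(cell) < KPI_TARGET:              # monitoring and evaluation
              adjust(cell)                            # optimization decision and implementation
              verified = measure(cell) >= KPI_TARGET  # post-evaluation and verification
              print(f"{name}: adjusted tilt to {cell['tilt_deg']} deg, target met: {verified}")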

    Scenario 5: Site Power Saving

    1. Definition and Description of Scenario
      Site power consumption accounts for more than 20% of network OPEX. Although network traffic declines greatly during idle hours, equipment continues to operate, and power consumption does not adjust dynamically to the traffic level, resulting in waste. It is therefore necessary to build a "Zero Bit, Zero Watt" capability (an illustrative power-saving sketch follows at the end of this scenario).
    2. Automation Classification
      Level 2: Tool aided execution;
      Level 3: Power-saving closed loop: based on the analysis of traffic trends, power-saving strategies are generated self-adaptively, with effect evaluation and closed-loop KPI feedback;
      Level 4: Real-time adjustment of power-saving strategies based on traffic prediction. Through integration with third-party space-time platforms, the operator can also add predictive perception of traffic changes, smooth out the user experience, and maximize power-saving.

      Initial outcomes: With the adoption of AI, the following initial results have been achieved:

      - Power saving: 10-15%
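
      As an illustration of the traffic-driven power-saving loop ("Zero Bit, Zero Watt"), the following minimal sketch switches a capacity carrier off when the predicted load falls below a threshold and restores it when load recovers. The hourly load profile and both thresholds are assumptions for illustration only.

      # Minimal sketch: shutting down a capacity carrier when predicted traffic is low.
      # The hourly load profile, thresholds, and wake-up guard are illustrative assumptions.
      hourly_load_pct = [55, 40, 18, 9, 7, 8, 15, 35]   # predicted load per hour, hypothetical
      SLEEP_THRESHOLD = 12                              # switch the capacity carrier off below this load
      WAKE_THRESHOLD = 20                               # switch it back on above this load

      carrier_on = True
      for hour, load in enumerate(hourly_load_pct):
          if carrier_on and load < SLEEP_THRESHOLD:
              carrier_on = False                        # power-saving action
          elif not carrier_on and load > WAKE_THRESHOLD:
              carrier_on = True                         # restore capacity before congestion affects KPIs
          print(f"hour {hour}: load {load}% -> capacity carrier {'ON' if carrier_on else 'OFF'}")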

    Scenario 6: Wireless Broadband Service Provisioning

    1. Definition and Description of Scenario
      WTTx (fixed wireless access) has become a foundational service for mobile operators because of its convenient installation and low cost per bit. Rapid launch of the WTTx service, accurate post-launch evaluation, and network development planning have become important enablers of new business development (an illustrative pre-provisioning check follows at the end of this scenario).
    2. Automation Classification
      Level 1: Blind launch;

      Level 2: Automation tools assist the launch: coverage and capacity at the user's location are checked at the business hall before the order is accepted, and the experience is evaluated;

      Level 3: Closed loop for business launch: integrated with the BOSS (business and operation support system) to achieve one-step precise launch, remote account activation, CPE installation, fault self-diagnosis, and complaint analysis;

      Level 4: Auto-balancing of multiple services, automatic identification of value areas, and network planning recommendations.
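
      As an illustration of the Level 2 pre-sales check, the following minimal sketch verifies coverage and remaining cell capacity at a customer location before a WTTx order is accepted. The serving-cell data, RSRP threshold, and capacity figures are assumptions for illustration only.

      # Minimal sketch: WTTx order pre-check on coverage and remaining cell capacity.
      # The serving-cell lookup, RSRP threshold, and capacity numbers are illustrative assumptions.
      cells = {
          "cell_A": {"rsrp_dbm": -95, "subscribers": 180, "max_subscribers": 200},
          "cell_B": {"rsrp_dbm": -118, "subscribers": 40, "max_subscribers": 200},
      }
      RSRP_MIN_DBM = -110     # assumed minimum signal level for an acceptable WTTx experience

      def can_provision(serving_cell: str) -> bool:
          """Accept the order only if coverage and capacity at the location are sufficient."""
          cell = cells[serving_cell]
          has_coverage = cell["rsrp_dbm"] >= RSRP_MIN_DBM
          has_capacity = cell["subscribers"] < cell["max_subscribers"]
          return has_coverage and has_capacity

      print(can_provision("cell_A"))   # True: good signal, spare capacity
      print(can_provision("cell_B"))   # False: signal below the assumed threshold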

    Stakeholders: Communications service providers, suppliers, industrial and consumer users
    Stakeholders' Assets, Values:
    Systems' Threats & Vulnerabilities: Incorrect AI system use
    Key Performance Indicators (KPIs):
    Seq. No. | Name | Description | Reference to mentioned use case objectives




    AI Features:
      Task(s): All
      Method(s): Machine learning, deep learning, knowledge graph, decision making & reasoning, analytics
      Hardware: AI training and inference system, and network management system
      Topology: End-to-end
      Terms & Concepts Used: Autonomous network, self-driving network
    Standardization Opportunities/Requirements: None
    Challenges & Issues: Data usage and sharing, human expertise & competence
    Societal Concerns:
      Description: None
      SDGs to be achieved: Industry, Innovation, and Infrastructure
    Data Characteristics:
      Description:
      Source:
      Type:
      Volume (size):
      Velocity:
      Variety:
      Variability (rate of change):
      Quality:
    Scenario Conditions:
    No. | Scenario Name | Scenario Description | Triggering Event | Pre-condition | Post-condition

    Scenario Name: Training
    Step No. | Event | Name of Process/Activity | Primary Actor | Description of Process/Activity | Requirement
    Specification of training data:

    Scenario Name: Evaluation
    Step No. | Event | Name of Process/Activity | Primary Actor | Description of Process/Activity | Requirement
    Input of Evaluation:
    Output of Evaluation:

    Scenario Name: Execution
    Step No. | Event | Name of Process/Activity | Primary Actor | Description of Process/Activity | Requirement
    Input of Execution:
    Output of Execution:

    Scenario Name: Retraining
    Step No. | Event | Name of Process/Activity | Primary Actor | Description of Process/Activity | Requirement
    Specification of retraining data:
    References:
    No. | Type | Reference | Status | Impact of use case | Originator Organization | Link

    Acceptable types of credible sources include:
  • Peer-reviewed scientific/technical publications on AI applications (e.g. [1]).
  • Patent documents describing AI solutions (e.g. [2], [3]).
  • Technical reports or presentations by renowned AI experts (e.g. [4]).
  • High quality company whitepapers and presentations
  • Publicly accessible sources with sufficient detail

    This list is not exhaustive. Other credible sources may be acceptable as well.

    Examples of credible sources:

    [1] B. Du Boulay. "Artificial Intelligence as an Effective Classroom Assistant". IEEE Intelligent Systems, vol. 31, pp. 76-81, 2016.

    [2] S. Hong. "Artificial intelligence audio apparatus and operation method thereof". US Patent No. 9,948,764. Available at: https://patents.google.com/patent/US20150120618A1/en. 2018.

    [3] M.R. Sumner, B.J. Newendorp and R.M. Orr. "Structured dictation using intelligent automated assistants". US Patent No. 9,865,280. 2018.

    [4] J. Hendler, S. Ellis, K. McGuire, N. Negedley, A. Weinstock, M. Klawonn and D. Burns. "WATSON@RPI, Technical Project Review".
    URL: https://www.slideshare.net/jahendler/watson-summer-review82013final. 2013