<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE rfc [
  <!ENTITY nbsp    "&#160;">
  <!ENTITY zwsp   "&#8203;">
  <!ENTITY nbhy   "&#8209;">
  <!ENTITY wj     "&#8288;">
]>
<?xml-stylesheet type="text/xsl" href="rfc2629.xslt" ?>
<!-- generated by https://github.com/cabo/kramdown-rfc version 1.7.18 (Ruby 3.0.2) -->
<rfc xmlns:xi="http://www.w3.org/2001/XInclude" ipr="trust200902" docName="draft-chen-bmwg-savnet-sav-benchmarking-01" category="std" consensus="true" submissionType="IETF" xml:lang="en" version="3">
  <!-- xml2rfc v2v3 conversion 3.22.0 -->
  <front>
    <title abbrev="SAVBench">Benchmarking Methodology for Source Address Validation</title>
    <seriesInfo name="Internet-Draft" value="draft-chen-bmwg-savnet-sav-benchmarking-01"/>
    <author initials="L." surname="Chen" fullname="Li Chen">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>lichen@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="D." surname="Li" fullname="Dan Li">
      <organization>Tsinghua University</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>tolidan@tsinghua.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Liu" fullname="Libin Liu">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>liulb@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Qin" fullname="Lancheng Qin">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>qinlc@zgclab.edu.cn</email>
      </address>
    </author>
    <date year="2024" month="August" day="07"/>
    <area>General [REPLACE]</area>
    <workgroup>IETF</workgroup>
    <abstract>
      <?line 64?>

<t>This document defines methodologies for benchmarking the performance of source address validation (SAV) mechanisms. SAV mechanisms generate SAV rules to prevent source address spoofing, and many different designs have been implemented to perform SAV in their respective scenarios. This document takes the approach of considering a SAV device to be a black box, defining the methodology in a manner that is agnostic to the underlying mechanisms. It provides a method for measuring the performance of both existing and new SAV implementations.</t>
    </abstract>
  </front>
  <middle>
    <?line 68?>

<section anchor="introduction">
      <name>Introduction</name>
      <t>Source address validation (SAV) is important for preventing source address spoofing. Operators are encouraged to deploy different SAV mechanisms <xref target="RFC3704"/> <xref target="RFC8704"/> based on their network environments. However, existing intra-domain and inter-domain SAV mechanisms have limitations in operational overhead and accuracy under various scenarios <xref target="intra-domain-ps"/> <xref target="inter-domain-ps"/>. Intra-domain and inter-domain SAVNET architectures <xref target="intra-domain-arch"/> <xref target="inter-domain-arch"/> have been proposed to guide the design of new intra-domain and inter-domain SAV mechanisms that address these limitations. The benchmarking methodology defined in this document will help operators better understand the SAV performance of their deployed devices when SAV is enabled and will also help vendors test the SAV implementations of their devices.</t>
      <t>This document provides generic methodologies for benchmarking SAV mechanism performance. To achieve the desired functionality, a SAV device may support many SAV mechanisms. This document considers a SAV device to be a black box, regardless of its design and implementation. The tests defined in this document can be used to benchmark a SAV device for SAV accuracy, convergence performance, and control plane and data plane forwarding performance. These tests can be performed on a hardware router, a bare metal server, a virtual machine (VM) instance, or a container instance, any of which runs as a SAV device. This document is intended for anyone who wants to measure a SAV device's performance or compare the performance of various SAV devices.</t>
      <section anchor="goal-and-scope">
        <name>Goal and Scope</name>
        <t>The benchmarking methodology outlined in this draft focuses on two objectives:</t>
        <ul spacing="normal">
          <li>
            <t>Assessing "which SAV mechanism performs best" over a set of well-defined scenarios.</t>
          </li>
          <li>
            <t>Measuring the contribution of sub-systems to the overall SAV system's performance (also known as "micro-benchmarking").</t>
          </li>
        </ul>
        <t>The benchmark aims to compare the SAV performance of individual devices, e.g., hardware or software routers. It showcases the performance of various SAV mechanisms for a given device and network scenario, with the objective of helping operators deploy the appropriate SAV mechanism for their network scenario.</t>
      </section>
      <section anchor="requirements-language">
        <name>Requirements Language</name>
        <t>The key words "<bcp14>MUST</bcp14>", "<bcp14>MUST NOT</bcp14>", "<bcp14>REQUIRED</bcp14>", "<bcp14>SHALL</bcp14>", "<bcp14>SHALL
NOT</bcp14>", "<bcp14>SHOULD</bcp14>", "<bcp14>SHOULD NOT</bcp14>", "<bcp14>RECOMMENDED</bcp14>", "<bcp14>NOT RECOMMENDED</bcp14>",
"<bcp14>MAY</bcp14>", and "<bcp14>OPTIONAL</bcp14>" in this document are to be interpreted as
described in BCP 14 <xref target="RFC2119"/> <xref target="RFC8174"/> when, and only when, they
appear in all capitals, as shown here.</t>
        <?line -18?>

</section>
    </section>
    <section anchor="terminology">
      <name>Terminology</name>
      <t>Improper Block: A validation result in which packets with legitimate source addresses are improperly blocked due to inaccurate SAV rules.</t>
      <t>Improper Permit: A validation result in which packets with spoofed source addresses are improperly permitted due to inaccurate SAV rules.</t>
      <t>SAV Control Plane: The SAV control plane consists of processes including gathering and communicating SAV-related information.</t>
      <t>SAV Data Plane: The SAV data plane stores the SAV rules within a specific data structure and validates each incoming packet to determine whether to permit or discard it.</t>
      <t>Host-facing Router: An intra-domain router of an AS that is connected to a host network (i.e., a layer-2 network).</t>
      <t>Customer-facing Router: An intra-domain router of an AS that is connected to an intra-domain customer network running a routing protocol (i.e., a layer-3 network).</t>
    </section>
    <section anchor="test-methodology">
      <name>Test Methodology</name>
      <section anchor="test-setup">
        <name>Test Setup</name>
        <t>The test setup in general is compliant with <xref target="RFC2544"/>. The Device Under Test (DUT) is connected to a Tester and other network devices to construct the network topology introduced in <xref target="testcase-sec"/>. The Tester is a traffic generator that generates network traffic with various source and destination addresses in order to emulate spoofing or legitimate traffic. Choosing various proportions of traffic is <bcp14>OPTIONAL</bcp14>, and the Tester needs to generate traffic at line speed to test the data plane forwarding performance.</t>
        <figure anchor="testsetup">
          <name>Test Setup.</name>
          <artwork><![CDATA[
    +~~~~~~~~~~~~~~~~~~~~~~~~~~+
    | Test Network Environment |
    |     +--------------+     |
    |     |              |     |
+-->|     |      DUT     |     |---+
|   |     |              |     |   |
|   |     +--------------+     |   |
|   +~~~~~~~~~~~~~~~~~~~~~~~~~~+   |
|                                  |
|         +--------------+         |
|         |              |         |
+---------|    Tester    |<--------+
          |              |
          +--------------+
]]></artwork>
        </figure>
        <t><xref target="testsetup"/> shows the test setup for the DUT. In the test network environment, the DUT can be connected to other devices to construct various test scenarios. The Tester can be connected to the DUT directly or through other devices. The connection type between them is determined according to the benchmarking tests in <xref target="testcase-sec"/>. The Tester can generate spoofing or legitimate traffic to test the SAV accuracy of the DUT in the corresponding scenarios, and it can also generate traffic at line speed to test the data plane forwarding performance of the DUT. In addition, the DUT needs to support logging to record all the test results.</t>
      </section>
      <section anchor="network-topology-and-device-configuration">
        <name>Network Topology and Device Configuration</name>
        <t>The location where the DUT resides in the network topology affects the accuracy of SAV mechanisms. Therefore, the benchmark <bcp14>MUST</bcp14> put the DUT into different locations in the network to test it.</t>
        <t>The devices in the network topology can have various routing configurations, and the generated SAV rules depend on those configurations. The device configurations used need to be specified as well.</t>
        <t>In addition, it is necessary to indicate the device role, such as host-facing router, customer-facing router, and AS border router in the intra-domain network, and the business relationship between ASes in the inter-domain network.</t>
        <t>When testing the data plane forwarding performance, the network traffic generated by the Tester must specify the traffic rate, the proportion of spoofing and legitimate traffic, and the distribution of source addresses, as all of these may affect the test results.</t>
      </section>
    </section>
    <section anchor="sav-performance-indicators">
      <name>SAV Performance Indicators</name>
      <t>This section lists key performance indicators (KPIs) of SAV for overall benchmarking tests. All KPIs <bcp14>MUST</bcp14> be measured in the benchmarking scenarios described in <xref target="testcase-sec"/>. Also, the KPIs <bcp14>MUST</bcp14> be measured from the result output of the DUT.</t>
      <section anchor="proportion-of-improper-blocks">
        <name>Proportion of Improper Blocks</name>
        <t>The proportion of all legitimate traffic that is improperly blocked by the DUT. This KPI reflects the SAV accuracy of the DUT.</t>
      </section>
      <section anchor="proportion-of-improper-permits">
        <name>Proportion of Improper Permits</name>
        <t>The proportion of all spoofing traffic that is improperly permitted by the DUT. This KPI reflects the SAV accuracy of the DUT.</t>
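        <t>As a minimal illustration (not part of the normative methodology), the two accuracy proportions can be computed from packet counters collected from the DUT and Tester logs; the counter names below are hypothetical:</t>
        <sourcecode type="python"><![CDATA[
def improper_block_proportion(blocked_legitimate, total_legitimate):
    # Fraction of all legitimate packets that the DUT improperly blocked.
    return blocked_legitimate / total_legitimate

def improper_permit_proportion(permitted_spoofed, total_spoofed):
    # Fraction of all spoofed packets that the DUT improperly permitted.
    return permitted_spoofed / total_spoofed
]]></sourcecode>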
      </section>
      <section anchor="protocol-convergence-time">
        <name>Protocol Convergence Time</name>
        <t>The control protocol convergence time is the period during which the SAV control plane protocol converges to update the SAV rules when routing changes happen, i.e., the time elapsed from the beginning of the routing change to the completion of the SAV rule update. This KPI indicates the convergence performance of the SAV protocol.</t>
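        <t>As an illustrative sketch (not normative), convergence time can be measured by triggering a routing change and polling the DUT until its SAV rules reflect the update. The callables trigger_routing_change and sav_rules_updated are hypothetical hooks supplied by a test harness:</t>
        <sourcecode type="python"><![CDATA[
import time

def measure_convergence_time(trigger_routing_change, sav_rules_updated,
                             poll_interval=0.01, timeout=60.0):
    # Returns the elapsed seconds from the start of the routing change
    # to the completion of the SAV rule update, or raises on timeout.
    start = time.monotonic()
    trigger_routing_change()
    while time.monotonic() - start < timeout:
        if sav_rules_updated():
            return time.monotonic() - start
        time.sleep(poll_interval)
    raise TimeoutError("SAV rules did not converge within the timeout")
]]></sourcecode>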
      </section>
      <section anchor="protocol-speaking-agent-processing-throughput">
        <name>Protocol-speaking Agent Processing Throughput</name>
        <t>The protocol-speaking agent processing throughput measures the throughput of processing the packets for communicating SAV-related information on the control plane, and it can indicate the SAV control plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-sav-table-refreshing-rate">
        <name>Data Plane SAV Table Refreshing Rate</name>
        <t>The data plane SAV table refreshing rate refers to the rate at which a DUT updates its SAV table with new SAV rules, and it can reflect the SAV data plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-forwarding-rate">
        <name>Data Plane Forwarding Rate</name>
        <t>The data plane forwarding rate measures the SAV data plane forwarding throughput for processing the data plane traffic, and it can indicate the SAV data plane performance of the DUT.</t>
      </section>
    </section>
    <section anchor="testcase-sec">
      <name>Benchmarking Tests</name>
      <section anchor="intra-domain-sav">
        <name>Intra-domain SAV</name>
        <section anchor="sav-accuracy">
          <name>SAV Accuracy</name>
          <section anchor="objective">
            <name>Objective</name>
            <t>Measure the accuracy of the DUT in processing legitimate traffic and spoofing traffic across various intra-domain network scenarios, including SAV for customer or host network, SAV for Internet-facing network, and SAV for aggregation-router-facing network. Accuracy is defined as the proportion of all legitimate traffic that is improperly blocked by the DUT and the proportion of all spoofing traffic that is improperly permitted by the DUT.</t>
          </section>
          <section anchor="test-scenarios">
            <name>Test Scenarios</name>
            <section anchor="sav-for-customer-or-host-network">
              <name>SAV for Customer or Host Network</name>
              <t><strong>Test Case 1</strong>:</t>
              <figure anchor="intra-domain-customer-syn">
                <name>SAV for customer or host network in intra-domain symmetric routing scenario.</name>
                <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
| FIB on DUT            +~~~~~~~~~~+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +----------+                       |
|                       |   DUT    |                       |
|                       +----------+                       |
|                         /\    |                          |
|Outbound traffic with     |    | Inbound traffic with     |
|source IP addresses       |    | destination IP addresses |
|of 10.0.0.0/15            |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                   +--------------------+
                   | Tester (Network 1) |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
              </figure>
              <t><xref target="intra-domain-customer-syn"/> shows the case of SAV for customer or host network in the intra-domain symmetric routing scenario, where the DUT performs SAV as a customer/host-facing router and connects to Router 1 to access the Internet. Network 1 is a customer/host network within the AS, connects to the DUT, and owns the prefix 10.0.0.0/15. The Tester can emulate Network 1 to advertise its prefix in the control plane and generate spoofing and legitimate traffic in the data plane. In this case, the Tester is configured so that the inbound traffic destined for 10.0.0.0/15 comes from the DUT. The DUT learns the route to prefix 10.0.0.0/15 from the Tester, while the Tester can send outbound traffic with source addresses in prefix 10.0.0.0/15 to the DUT, which emulates a symmetric routing scenario between the Tester and the DUT. The IP addresses in this test case are examples; users can use other IP addresses, and this holds true for the other test cases as well.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for the customer or host network in the intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-customer-syn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions of Network 1.</t>
                </li>
                <li>
                  <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester generates traffic using 10.0.0.0/15 as source addresses (legitimate traffic) and traffic using 10.2.0.0/15 as source addresses (spoofing traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
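              <t>The traffic mix in the last step can be sketched as follows. This is an illustrative helper (not part of the methodology) that draws random source addresses from the legitimate and spoofed prefixes at a requested spoofed fraction, e.g., 0.1 for a 1:9 ratio:</t>
              <sourcecode type="python"><![CDATA[
import ipaddress
import random

def build_source_mix(legit_prefix, spoof_prefix, spoof_ratio, count, seed=0):
    # Returns a list of (source_address, is_spoofed) pairs, where
    # spoof_ratio is the fraction of spoofed packets in the stream.
    rng = random.Random(seed)
    legit = ipaddress.ip_network(legit_prefix)
    spoof = ipaddress.ip_network(spoof_prefix)
    stream = []
    for _ in range(count):
        spoofed = rng.random() < spoof_ratio
        net = spoof if spoofed else legit
        # Pick a random host address inside the chosen prefix.
        addr = net.network_address + rng.randrange(net.num_addresses)
        stream.append((str(addr), spoofed))
    return stream
]]></sourcecode>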
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from Network 1 for this test case.</t>
              <t><strong>Test Case 2</strong>:</t>
              <figure anchor="intra-domain-customer-asyn">
                <name>SAV for customer or host network in intra-domain asymmetric routing scenario.</name>
                <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment               AS |
|                       +~~~~~~~~~~+                          |
|                       | Router 2 |                          |
| FIB on DUT            +~~~~~~~~~~+   FIB on Router 1        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  Router 2  /         \/  10.1.0.0/16  Router 2  |
|               +----------+     +~~~~~~~~~~+                 |
|               |   DUT    |     | Router 1 |                 |
|               +----------+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|Outbound traffic with \          / Inbound traffic with      |
|source IP addresses    \        /  destination IP addresses  |
|of 10.0.0.0/16          \      /   of 10.0.0.0/16            |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \  \/
                   +--------------------+
                   | Tester (Network 1) |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
              </figure>
              <t><xref target="intra-domain-customer-asyn"/> shows the case of SAV for customer or host network in the intra-domain asymmetric routing scenario, where the DUT performs SAV as a customer/host-facing router. Network 1 is a customer/host network within the AS, connects to both the DUT and Router 1, and owns the prefix 10.0.0.0/15. The Tester can emulate Network 1 and perform its control plane and data plane functions. In this case, the Tester is configured so that the inbound traffic destined for 10.1.0.0/16 comes only from the DUT and the inbound traffic destined for 10.0.0.0/16 comes only from Router 1. The DUT only learns the route to prefix 10.1.0.0/16 from the Tester, while Router 1 only learns the route to prefix 10.0.0.0/16 from Network 1. Then, the DUT and Router 1 advertise their learned prefixes to Router 2. In addition, the DUT learns the route to 10.0.0.0/16 from Router 2, and Router 1 learns the route to 10.1.0.0/16 from Router 2. The Tester can send outbound traffic with source addresses in prefix 10.0.0.0/16 to the DUT, which emulates an asymmetric routing scenario between the Tester and the DUT.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for the customer or host network in the intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-customer-asyn"/> to construct the test network environment. The Tester is connected to the DUT and Router 1 and performs the functions of Network 1.</t>
                </li>
                <li>
                  <t>Then, the devices including the DUT, Router 1, and Router 2 are configured to form the asymmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester generates traffic using 10.1.0.0/16 as source addresses (spoofing traffic) and traffic using 10.0.0.0/16 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from Network 1 for this test case.</t>
            </section>
            <section anchor="sav-for-internet-facing-network">
              <name>SAV for Internet-facing Network</name>
              <t><strong>Test Case 1</strong>:</t>
              <figure anchor="intra-domain-internet-syn">
                <name>SAV for Internet-facing network in intra-domain symmetric routing scenario.</name>
                <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                          |
|                          |   \/                          |
|                       +----------+                       |
|                       |    DUT   | SAV facing Internet   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|Outbound traffic with     |    | Inbound traffic with     |
|source IP addresses       |    | destination IP addresses |
|of 10.0.0.0/15            |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                   +--------------------+
                   |     Network 1      |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
              </figure>
              <t><xref target="intra-domain-internet-syn"/> shows the test case of SAV for Internet-facing network in the intra-domain symmetric routing scenario. In this test case, the network topology is the same as in <xref target="intra-domain-customer-syn"/>; the difference is the location of the DUT in the network topology, where the DUT is connected to Router 1 and the Internet, and the Tester is used to emulate the Internet. The DUT performs Internet-facing SAV instead of customer/host-network-facing SAV.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for the Internet-facing network in the intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-internet-syn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions of the Internet.</t>
                </li>
                <li>
                  <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (spoofing traffic) and traffic using 10.2.0.0/15 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from the Internet for this test case.</t>
              <t><strong>Test Case 2</strong>:</t>
              <figure anchor="intra-domain-internet-asyn">
                <name>SAV for Internet-facing network in intra-domain asymmetric routing scenario.</name>
                <artwork><![CDATA[
                    +---------------------+
                    |  Tester (Internet)  |
                    +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                             |
|                          |   \/                             |
|                       +----------+                          |
|                       |    DUT   |                          |
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  DUT       /         \/  10.1.0.0/16  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|Outbound traffic with \          / Inbound traffic with      |
|source IP addresses    \        /  destination IP addresses  |
|of 10.0.0.0/16          \      /   of 10.0.0.0/16            |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \  \/
                   +--------------------+
                   |     Network 1      |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
              </figure>
              <t><xref target="intra-domain-internet-asyn"/> shows the test case of SAV for Internet-facing network in the intra-domain asymmetric routing scenario. In this test case, the network topology is the same as in <xref target="intra-domain-customer-asyn"/>; the difference is the location of the DUT in the network topology, where the DUT is connected to Router 1 and Router 2 within the same AS, as well as the Internet. The Tester is used to emulate the Internet. The DUT performs Internet-facing SAV instead of customer/host-network-facing SAV.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for the Internet-facing network in the intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-internet-asyn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions of the Internet.</t>
                </li>
                <li>
                  <t>Then, the devices including the DUT, Router 1, and Router 2 are configured to form the asymmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (spoofing traffic) and traffic using 10.2.0.0/15 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from the Internet for this test case.</t>
            </section>
            <section anchor="sav-for-aggregation-router-facing-network">
              <name>SAV for Aggregation-router-facing Network</name>
              <t><strong>Test Case 1</strong>:</t>
              <figure anchor="intra-domain-agg-syn">
                <name>SAV for aggregation-router-facing network in intra-domain symmetric routing scenario.</name>
                <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +----------+                       |
|                       |    DUT   | SAV facing Router 1   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|Outbound traffic with     |    | Inbound traffic with     |
|source IP addresses       |    | destination IP addresses |
|of 10.0.0.0/15            |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                   +--------------------+
                   | Tester (Network 1) |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
              </figure>
              <t><xref target="intra-domain-agg-syn"/> shows the test case of SAV for aggregation-router-facing network in the intra-domain symmetric routing scenario. The test network environment of <xref target="intra-domain-agg-syn"/> is the same as that of <xref target="intra-domain-internet-syn"/>. The Tester is connected to Router 1 and emulates the functions of Network 1 to test the SAV accuracy of the DUT facing the direction of Router 1.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for the aggregation-router-facing network in the intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-agg-syn"/> to construct the test network environment. The Tester is connected to Router 1 and performs the functions of Network 1.</t>
                </li>
                <li>
                  <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (legitimate traffic) and traffic using 10.2.0.0/15 as source addresses (spoofing traffic) to Router 1, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from the direction of Router 1 for this test case.</t>
              <t><strong>Test Case 2</strong>:</t>
              <figure anchor="intra-domain-agg-asyn">
                <name>SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario.</name>
                <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|                       +----------+                          |
|                       |    DUT   | SAV facing Router 1 and 2|
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  DUT       /         \/  10.1.0.0/16  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|Outbound traffic with \          / Inbound traffic with      |
|source IP addresses    \        /  destination IP addresses  |
|of 10.0.0.0/16          \      /   of 10.0.0.0/16            |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \  \/
                   +--------------------+
                   | Tester (Network 1) |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
              </figure>
              <t><xref target="intra-domain-agg-asyn"/> shows the test case of SAV for the aggregation-router-facing network in the intra-domain asymmetric routing scenario. The test network environment of <xref target="intra-domain-agg-asyn"/> is the same as that of <xref target="intra-domain-internet-asyn"/>. The Tester is connected to Router 1 and Router 2 and emulates the functions of Network 1 so that the SAV accuracy of the DUT can be tested in the directions of Router 1 and Router 2.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for the aggregation-router-facing network in the intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-agg-asyn"/> to construct the test network environment. The Tester is connected to Router 1 and Router 2 and performs the functions of Network 1.</t>
                </li>
                <li>
                  <t>Then, the devices including the DUT, Router 1, and Router 2 are configured to form the asymmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends traffic with source addresses in 10.1.0.0/16 (spoofing traffic) and traffic with source addresses in 10.0.0.0/16 (legitimate traffic) to Router 1. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from the directions of Router 1 and Router 2 for this test case.</t>
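<t>The expected behavior can be sketched as a per-interface lookup, assuming the DUT installs interface-indexed allowlists. The interface names and the binding of the legitimate prefix 10.0.0.0/16 to both router-facing interfaces are illustrative assumptions, reflecting that asymmetric routing lets Network 1's traffic arrive via either router.</t>

```python
import ipaddress

# Hypothetical per-interface SAV allowlists for this test case; an
# accurate DUT binds the legitimate source prefix to both interfaces.
SAV_TABLE = {
    "to_router1": [ipaddress.ip_network("10.0.0.0/16")],
    "to_router2": [ipaddress.ip_network("10.0.0.0/16")],
}

def sav_permit(iface, src_ip):
    """Permit a packet only if its source address matches a prefix
    bound to the arrival interface."""
    src = ipaddress.ip_address(src_ip)
    return any(src in pfx for pfx in SAV_TABLE[iface])
```

<t>A strict check bound to only one interface would improperly block legitimate traffic on the other path, which is what this test case is designed to expose.</t>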
            </section>
          </section>
        </section>
        <section anchor="control-plane-performance">
          <name>Control Plane Performance</name>
          <section anchor="intra-pcp">
            <name>Protocol Convergence Performance</name>
            <section anchor="objective-1">
              <name>Objective</name>
              <t>Measure the protocol convergence performance of the DUT when route changes happen due to network failures or operator configurations. It is defined as the protocol convergence time, i.e., the time elapsed from the beginning of the routing change to the completion of the SAV rule update.</t>
            </section>
            <section anchor="test-scenario">
              <name>Test Scenario</name>
              <figure anchor="intra-convg-perf">
                <name>Test setup for protocol convergence performance measurement.</name>
                <artwork><![CDATA[
+-------------+          +-----------+
|   Tester    |<-------->|    DUT    |
+-------------+          +-----------+
]]></artwork>
              </figure>
              <t><strong>Test Case</strong>:</t>
              <t><xref target="intra-convg-perf"/> shows the test setup for protocol convergence performance measurement. The protocol convergence process in which the DUT updates its SAV rules is launched when route changes happen. Route changes are the cause of updating SAV rules and may result from network failures or operator configurations. Therefore, in <xref target="intra-convg-perf"/>, the Tester is directly connected to the DUT and emulates route changes by adding or withdrawing prefixes, which launches the convergence process of the DUT.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test the protocol convergence time of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
                </li>
                <li>
                  <t>Then, the Tester proactively withdraws a certain percentage of the overall prefixes supported by the DUT, such as 10%, 20%, ..., 100%.</t>
                </li>
                <li>
                  <t>Finally, the protocol convergence time is calculated according to the logs of the DUT about the beginning and completion of the protocol convergence.</t>
                </li>
              </ol>
              <t><strong>Measurements</strong>: The logs of the DUT record the beginning time of the protocol convergence process and its completion time, and the protocol convergence time is calculated by subtracting the beginning time from the completion time of the protocol convergence process.</t>
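<t>As a sketch of this measurement, the convergence time can be derived from two timestamped log lines. The log format below is invented for illustration only; real DUT log formats are vendor-specific.</t>

```python
from datetime import datetime

# Invented log format for illustration; not a real DUT's output.
LOGS = [
    "2024-08-07T10:00:00.120 SAV convergence started",
    "2024-08-07T10:00:02.620 SAV convergence completed",
]

def convergence_time(logs):
    """Completion time minus beginning time, in seconds."""
    stamps = {}
    for line in logs:
        ts, _, event = line.partition(" SAV convergence ")
        stamps[event] = datetime.fromisoformat(ts)
    return (stamps["completed"] - stamps["started"]).total_seconds()
```

<t>For the two sample lines above, the sketch reports a convergence time of 2.5 seconds.</t>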
            </section>
          </section>
          <section anchor="intra-cpp">
            <name>Protocol-speaking Agent Performance</name>
            <section anchor="objective-2">
              <name>Objective</name>
              <t>Measure the protocol-speaking agent performance of the DUT during the convergence process of the control plane protocol, defined as the protocol-speaking agent processing throughput representing the overall size of protocol messages per second.</t>
            </section>
            <section anchor="test-scenario-1">
              <name>Test Scenario</name>
              <t><strong>Test Case</strong>:</t>
              <t>The test of the protocol-speaking agent performance uses the same test setup shown in <xref target="intra-convg-perf"/>. This test measures the processing throughput of the protocol-speaking agent when processing protocol messages. Therefore, the Tester can vary the rate of sending protocol messages, such as from 10% to 100% of the overall link capacity between the Tester and the DUT. The DUT then records the total size of the processed protocol messages and the processing time.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test the protocol-speaking agent processing throughput of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
                </li>
                <li>
                  <t>Then, the Tester proactively sends protocol messages to the DUT at a certain percentage of the overall link capacity between the Tester and the DUT, such as 10%, 20%, ..., 100%.</t>
                </li>
                <li>
                  <t>Finally, the protocol-speaking agent processing throughput is calculated according to the logs of the DUT about the overall size of the protocol messages and the overall processing time.</t>
                </li>
              </ol>
              <t><strong>Measurements</strong>: The logs of the DUT record the overall size of the protocol messages and the overall processing time, and the protocol-speaking agent processing throughput is calculated by dividing the overall size of the protocol messages by the overall processing time.</t>
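<t>The load sweep of step 2 and the throughput calculation can be sketched as follows; the function names and the sample numbers are illustrative assumptions.</t>

```python
def send_rates(link_capacity_bps, steps=10):
    """Message send rates covering 10%, 20%, ..., 100% of link capacity."""
    return [link_capacity_bps * k // steps for k in range(1, steps + 1)]

def agent_throughput(total_msg_bytes, processing_seconds):
    """Protocol-speaking agent processing throughput: overall size of
    protocol messages divided by overall processing time."""
    return total_msg_bytes / processing_seconds

# e.g. 50 MB of protocol messages processed in 4 s -> 12.5 MB/s
rate = agent_throughput(50_000_000, 4.0)
```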
            </section>
          </section>
        </section>
        <section anchor="data-plane-performance">
          <name>Data Plane Performance</name>
          <section anchor="intra-dpsavtrp">
            <name>Data Plane SAV Table Refreshing Performance</name>
            <section anchor="objective-3">
              <name>Objective</name>
              <t>Measure the data plane SAV table refreshing performance of the DUT when its SAV table is updated according to the newly generated SAV rules on the control plane. It is defined as the data plane SAV table refreshing rate, i.e., the rate at which the DUT updates its SAV table with new SAV rules.</t>
            </section>
            <section anchor="test-scenario-2">
              <name>Test Scenario</name>
              <t><strong>Test Case</strong>:</t>
              <t>The test of the data plane SAV table refreshing performance uses the same test setup shown in <xref target="intra-convg-perf"/>. This test measures the rate at which the DUT updates its SAV table with new SAV rules. Therefore, the Tester can vary the rate of sending protocol messages, such as from 10% to 100% of the overall link capacity between the Tester and the DUT, which affects the proportion of updated SAV rules and, as a result, the proportion of updated SAV table entries. The DUT then records the overall number of updated SAV table entries and the refreshing time.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test the data plane SAV table refreshing rate of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
                </li>
                <li>
                  <t>Then, the Tester proactively sends protocol messages to the DUT at a certain percentage of the overall link capacity between the Tester and the DUT, such as 10%, 20%, ..., 100%.</t>
                </li>
                <li>
                  <t>Finally, the data plane SAV table refreshing rate is calculated according to the logs of the DUT about the overall number of updated SAV table entries and the overall refreshing time.</t>
                </li>
              </ol>
              <t><strong>Measurements</strong>: The logs of the DUT record the overall number of updated SAV table entries and the overall refreshing time, and the data plane SAV table refreshing rate is calculated by dividing the overall number of updated SAV table entries by the overall refreshing time.</t>
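<t>The counting step can be sketched as a set difference between the old and new rule sets, under the simplifying assumption that every added or removed rule touches one table entry; real hardware may instead update entries in place.</t>

```python
def updated_entry_count(old_table, new_table):
    """Entries the data plane must touch when a new SAV rule set is
    pushed: additions plus removals (set-difference sketch)."""
    old, new = set(old_table), set(new_table)
    return len(new - old) + len(old - new)

def refresh_rate(n_updated, refreshing_seconds):
    """Data plane SAV table refreshing rate, in entries per second."""
    return n_updated / refreshing_seconds
```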
            </section>
          </section>
          <section anchor="intra-dpfp">
            <name>Data Plane Forwarding Performance</name>
            <t><strong>Test Case</strong>:</t>
            <t>The test of the data plane forwarding performance uses the same test setup shown in <xref target="intra-convg-perf"/>. The Tester needs to send traffic, which includes spoofing and legitimate traffic, at the rate of the overall link capacity between the Tester and the DUT, and the DUT builds a SAV table that occupies the entire allocated storage space. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1. The DUT records the overall size of the forwarded packets and the overall forwarding time.</t>
            <t><strong>Procedure</strong>:</t>
            <ol spacing="normal" type="1"><li>
                <t>First, in order to test the data plane forwarding rate of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
              </li>
              <li>
                <t>Then, the Tester proactively sends the data plane traffic including spoofing and legitimate traffic to the DUT at the rate of the overall link capacity between the Tester and the DUT. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
              </li>
              <li>
                <t>Finally, the data plane forwarding rate is calculated according to the logs of the DUT about the overall size of the forwarded traffic and the overall forwarding time.</t>
              </li>
            </ol>
            <t><strong>Measurements</strong>: The logs of the DUT record the overall size of the forwarded traffic and the overall forwarding time, and the data plane forwarding rate is calculated by dividing the overall size of the forwarded traffic by the overall forwarding time.</t>
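<t>Putting the filtering and the rate calculation together, a minimal sketch follows; the single allowed prefix and the packet representation as (source address, size) pairs are assumptions for illustration.</t>

```python
import ipaddress

# Illustrative SAV rule: only this source prefix is legitimate.
ALLOWED = [ipaddress.ip_network("10.0.0.0/16")]

def forwarding_rate(packets, forwarding_seconds):
    """Apply SAV to (src_ip, size_bytes) packets, then report the data
    plane forwarding rate: forwarded bytes / overall forwarding time."""
    forwarded = sum(
        size for src, size in packets
        if any(ipaddress.ip_address(src) in pfx for pfx in ALLOWED)
    )
    return forwarded / forwarding_seconds
```

<t>Only the permitted (legitimate) bytes count toward the forwarding rate; blocked spoofing traffic is excluded by construction.</t>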
          </section>
        </section>
      </section>
      <section anchor="inter-domain-sav">
        <name>Inter-domain SAV</name>
        <section anchor="sav-accuracy-1">
          <name>SAV Accuracy</name>
          <section anchor="objective-4">
            <name>Objective</name>
            <t>Measure the accuracy of the DUT in processing legitimate traffic and spoofing traffic across various inter-domain network scenarios, including SAV for customer-facing ASes and SAV for provider/peer-facing ASes. It is defined as the proportion of legitimate traffic that is improperly blocked by the DUT out of all legitimate traffic and the proportion of spoofing traffic that is improperly permitted by the DUT out of all spoofing traffic.</t>
          </section>
          <section anchor="test-scenario-3">
            <name>Test Scenario</name>
            <section anchor="sav-for-customer-facing-ases">
              <name>SAV for Customer-facing ASes</name>
              <t><strong>Test Case 1</strong>:</t>
              <figure anchor="inter-customer-syn">
                <name>SAV for customer-facing ASes in inter-domain symmetric routing scenario.</name>
                <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|             \         |              \            \      |
|     P6[AS 1] \        |               \            \     |
|      P1[AS 1] \       |                \            \    |
|          (C2P) \      | (C2P/P2P) (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
              </figure>
              <t><xref target="inter-customer-syn"/> presents a test case of SAV for customer-facing ASes in the inter-domain symmetric routing scenario. In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefixes P1 and P6 to AS 2 and the DUT, and AS 2 further propagates the routes for prefixes P1 and P6 to the DUT. Consequently, the DUT can learn the routes for prefixes P1 and P6 from both AS 1 and AS 2. In this test case, the legitimate path for the traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT. The Tester is connected to AS 1, and the SAV for customer-facing ASes of the DUT is tested.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in inter-domain symmetric routing scenario, a testbed can be built as shown in <xref target="inter-customer-syn"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
                </li>
                <li>
                  <t>Then, the ASes including  AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the symmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends traffic with source addresses in P1 and destination addresses in P4 (legitimate traffic) and traffic with source addresses in P5 and destination addresses in P4 (spoofing traffic) to the DUT via AS 2. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from the direction of AS 2 for this test case.</t>
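<t>The expected behavior on the customer-facing interface can be sketched as a prefix lookup. The concrete documentation prefixes below stand in for the draft's symbolic P1 and P5 and are assumptions; any pair of disjoint prefixes would do.</t>

```python
import ipaddress

# Documentation prefixes standing in for the symbolic P1 and P5.
P1 = ipaddress.ip_network("192.0.2.0/24")     # legitimate source, AS 1
P5 = ipaddress.ip_network("198.51.100.0/24")  # spoofed source, AS 5

# On the interface facing customer AS 2, an accurate DUT permits
# sources reachable through its customer cone (P1) and blocks P5.
CUSTOMER_IFACE_ALLOW = [P1]

def sav_permit(src_ip):
    src = ipaddress.ip_address(src_ip)
    return any(src in pfx for pfx in CUSTOMER_IFACE_ALLOW)
```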
              <t><strong>Test Case 2</strong>:</t>
              <figure anchor="inter-customer-lpp">
                <name>SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT.</name>
                <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|    P6[AS 1] \         | NO_EXPORT    \            \      |
|     P1[AS 1] \        |               \            \     |
|     NO_EXPORT \       |                \            \    |
|          (C2P) \      | (C2P)     (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
              </figure>
              <t><xref target="inter-customer-lpp"/> presents a test case of SAV for customer-facing ASes in the inter-domain asymmetric routing scenario caused by the NO_EXPORT configuration. In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefix P1 to AS 2 and adds the NO_EXPORT community attribute to the BGP advertisement sent to AS 2, preventing AS 2 from further propagating the route for prefix P1 to the DUT. Similarly, AS 1 adds the NO_EXPORT community attribute to the BGP advertisement sent to the DUT, resulting in the DUT not propagating the route for prefix P6 to AS 3. Consequently, the DUT only learns the route for prefix P1 from AS 1 in this scenario. In this test case, the legitimate path for the traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT. The Tester is connected to AS 1, and the SAV for customer-facing ASes of the DUT is tested.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT, a testbed can be built as shown in <xref target="inter-customer-lpp"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
                </li>
                <li>
                  <t>Then, the ASes including  AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the asymmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends traffic with source addresses in P1 and destination addresses in P4 (legitimate traffic) and traffic with source addresses in P5 and destination addresses in P4 (spoofing traffic) to the DUT via AS 2. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from the direction of AS 2 for this test case.</t>
              <t><strong>Test Case 3</strong>:</t>
              <figure anchor="inter-customer-dsr">
                <name>SAV for customer-facing ASes in the scenario of direct server return (DSR).</name>
                <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                       |
|                                +----------------+               |
|                Anycast Server+-+    AS 3(P3)    |               |
|                                +-+/\----+/\+----+               |
|                                   /       \                     |
|                         P3[AS 3] /         \ P3[AS 3]           |
|                                 /           \                   |
|                                / (C2P)       \                  |
|                       +----------------+      \                 |
|                       |     DUT(P4)    |       \                |
|                       ++/\+--+/\+--+/\++        \               |
|          P6[AS 1, AS 2] /     |      \           \              |
|               P2[AS 2] /      |       \           \             |
|                       /       |        \           \            |
|                      / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|      +----------------+       |          \           \          |
|User+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|      +----------+/\+--+       | P6[AS 1]   \           \        |
|          P6[AS 1] \           | NO_EXPORT   \           \       |
|           P1[AS 1] \          |              \           \      |
|           NO_EXPORT \         |               \           \     |
|                      \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                    +----------------+        +----------------+ |
|                    |AS 1(P1, P3, P6)|        |    AS 5(P5)    | |
|                    +----------------+        +----------------+ |
|                         /\     |                                |
|                          |     |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                     +----------------+
                     |     Tester     |
                     | (Edge Server)  |
                     +----------------+

Within the test network environment, P3 is the anycast prefix and is only advertised by AS 3 through BGP.
]]></artwork>
              </figure>
              <t><xref target="inter-customer-dsr"/> presents a test case of SAV for customer-facing ASes in the scenario of direct server return (DSR). In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. When users in AS 2 send requests to the anycast destination IP address, the forwarding path is AS 2-&gt;DUT-&gt;AS 3. The anycast servers in AS 3 receive the requests and tunnel them to the edge servers in AS 1. Finally, the edge servers send the content to the users with source addresses in prefix P3, and the reverse forwarding path is AS 1-&gt;DUT-&gt;AS 2. The Tester sends the traffic with source addresses in P3 and destination addresses in P2 along the path AS 1-&gt;DUT-&gt;AS 2.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in the scenario of DSR, a testbed can be built as shown in <xref target="inter-customer-dsr"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
                </li>
                <li>
                  <t>Then, the ASes including  AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the scenario of DSR.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends the traffic using P3 as source addresses and P2 as destination addresses (legitimate traffic) to AS 2 via the DUT.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can permit the legitimate traffic with source addresses in P3 from the direction of AS 1 for this test case.</t>
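<t>The DSR case illustrates why a purely origin-based allowlist on the AS 1-facing interface is insufficient: edge servers in AS 1 legitimately source traffic from the anycast prefix P3, which AS 1 does not advertise. The sketch below contrasts the two rule sets; the concrete documentation prefixes stand in for the symbolic P1 and P3 and are assumptions.</t>

```python
import ipaddress

# Documentation prefixes standing in for the symbolic P1 and P3.
P1 = ipaddress.ip_network("192.0.2.0/24")    # originated by AS 1
P3 = ipaddress.ip_network("203.0.113.0/24")  # anycast prefix of AS 3

# A strict origin-based allowlist omits P3; an accurate SAV rule set
# for the DSR scenario must also include the anycast prefix.
STRICT_ALLOW = [P1]
DSR_AWARE_ALLOW = [P1, P3]

def permits(allowlist, src_ip):
    src = ipaddress.ip_address(src_ip)
    return any(src in pfx for pfx in allowlist)
```

<t>Under the strict rule set the P3-sourced legitimate traffic would be improperly blocked, which is exactly the failure mode this test case checks for.</t>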
              <t><strong>Test Case 4</strong>:</t>
              <figure anchor="inter-customer-reflect">
                <name>SAV for customer-facing ASes in the scenario of reflection attacks.</name>
                <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P1')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |  Server+-+    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' is the spoofed source prefix P1 by the attacker which is inside of 
AS 2 or connected to AS 2 through other ASes.
]]></artwork>
              </figure>
              <t><xref target="inter-customer-reflect"/> depicts the test case of SAV for customer-facing ASes in the scenario of reflection attacks. In this test case, the reflection attack by source address spoofing takes place within DUT's customer cone, where the attacker spoofs the victim's IP address (P1) and sends requests to servers' IP address (P5) that are designed to respond to such requests. The Tester performs the source address spoofing function as an attacker. The arrows in <xref target="inter-customer-reflect"/> illustrate the commercial relationships between ASes.  AS 3 serves as the provider for the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5.  Additionally, AS 2 is the provider for AS 1.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in the scenario of reflection attacks, a testbed can be built as shown in <xref target="inter-customer-reflect"/> to construct the test network environment. The Tester is connected to AS 2 and generates the test traffic to the DUT.</t>
                </li>
                <li>
                  <t>Then, the ASes including  AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the scenario of reflection attacks.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends the traffic using P1 as source addresses and P5 as destination addresses (spoofing traffic) to AS 5 via the DUT.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic with source addresses in P1 from the direction of AS 2 for this test case.</t>
              <t><strong>Test Case 5</strong>:</t>
              <figure anchor="inter-customer-direct">
                <name>SAV for customer-facing ASes in the scenario of direct attacks.</name>
                <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P5')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |          |    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P5' denotes the source prefix P5 spoofed by the attacker, which is inside
AS 2 or connected to AS 2 through other ASes.
]]></artwork>
              </figure>
              <t><xref target="inter-customer-direct"/> presents the test case of SAV for customer-facing ASes in the scenario of direct attacks. In this test case, the direct attack based on source address spoofing takes place within the DUT's customer cone: the attacker spoofs a source address in P5 and directly targets the victim's IP address (P1), overwhelming its network resources. The Tester performs the source address spoofing function of the attacker. The arrows in <xref target="inter-customer-direct"/> illustrate the commercial relationships between ASes. AS 3 serves as the provider for the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, to test whether the DUT can generate accurate SAV rules for customer-facing ASes in the scenario of direct attacks, a testbed is built as shown in <xref target="inter-customer-direct"/> to construct the test network environment. The Tester is connected to AS 2 and generates the test traffic toward the DUT.</t>
                </li>
                <li>
                  <t>Then, AS 1, AS 2, AS 3, the DUT, and AS 5 are configured to form the direct attack scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends traffic with source addresses in P5 and destination addresses in P1 (spoofing traffic) to AS 1 via the DUT.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT blocks the spoofing traffic with source addresses in P5 arriving from the direction of AS 2.</t>
            </section>
            <section anchor="sav-for-providerpeer-facing-ases">
              <name>SAV for Provider/Peer-facing ASes</name>
              <t><strong>Test case 1</strong>:</t>
              <figure anchor="reflection-attack-p">
                <name>SAV for provider-facing ASes in the scenario of reflection attacks.</name>
                <artwork><![CDATA[
                                   +----------------+
                                   |     Tester     |
                                   |   (Attacker)   |
                                   |      (P1')     |
                                   +----------------+
                                        |     /\
                                        |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment              \/     |                    |
|                                  +----------------+               |
|                                  |                |               |
|                                  |    AS 3(P3)    |               |
|                                  |                |               |
|                                  +-+/\----+/\+----+               |
|                                     /       \                     |
|                                    /         \                    |
|                                   /           \                   |
|                                  / (C2P/P2P)   \                  |
|                         +----------------+      \                 |
|                         |     DUT(P4)    |       \                |
|                         ++/\+--+/\+--+/\++        \               |
|            P6[AS 1, AS 2] /     |      \           \              |
|                 P2[AS 2] /      |       \           \             |
|                         /       |        \           \            |
|                        / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|        +----------------+       |          \           \          |
|Server+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|        +----------+/\+--+       | P6[AS 1]   \           \        |
|            P6[AS 1] \           | NO_EXPORT   \           \       |
|             P1[AS 1] \          |              \           \      |
|             NO_EXPORT \         |               \           \     |
|                        \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                      +----------------+        +----------------+ |
|              Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|                      +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' denotes the source prefix P1 spoofed by the attacker, which is inside
AS 3 or connected to AS 3 through other ASes.
]]></artwork>
              </figure>
              <t><xref target="reflection-attack-p"/> depicts the test case of SAV for provider-facing ASes in the scenario of reflection attacks. In this test case, the attacker spoofs the victim's IP address (P1) and sends requests to the servers' IP addresses (P2), which respond to such requests. The Tester performs the source address spoofing function of the attacker. The servers then send overwhelming responses back to the victim, exhausting its network resources. The arrows in <xref target="reflection-attack-p"/> represent the commercial relationships between ASes. AS 3 acts as the provider or lateral peer of the DUT and the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, to test whether the DUT can generate accurate SAV rules for provider-facing ASes in the scenario of reflection attacks, a testbed is built as shown in <xref target="reflection-attack-p"/> to construct the test network environment. The Tester is connected to AS 3 and generates the test traffic toward the DUT.</t>
                </li>
                <li>
                  <t>Then, AS 1, AS 2, AS 3, the DUT, and AS 5 are configured to form the reflection attack scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends traffic with source addresses in P1 and destination addresses in P2 (spoofing traffic) to AS 2 via AS 3 and the DUT.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT blocks the spoofing traffic with source addresses in P1 arriving from the direction of AS 3.</t>
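              <t>The expected result can be checked mechanically by comparing the Tester's transmit counter with the receive counter observed on the far side of the DUT. The helper below is an illustrative sketch; the all-or-nothing pass criteria are a strictness choice of this example, not a requirement of this document.</t>
              <sourcecode type="python"><![CDATA[
def sav_verdict(sent: int, received: int, expect_blocked: bool) -> bool:
    """Return True when the DUT handled one test flow as expected.

    sent           -- packets of the flow transmitted by the Tester
    received       -- packets of the flow observed beyond the DUT
    expect_blocked -- True for spoofing traffic, False for legitimate traffic
    """
    if sent == 0:
        raise ValueError("no test traffic was generated")
    if expect_blocked:        # spoofing traffic must be dropped completely
        return received == 0
    return received == sent   # legitimate traffic must pass unharmed
]]></sourcecode>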
              <t><strong>Test case 2</strong>:</t>
              <figure anchor="direct-attack-p">
                <name>SAV for provider-facing ASes in the scenario of direct attacks.</name>
                <artwork><![CDATA[
                           +----------------+
                           |     Tester     |
                           |   (Attacker)   |
                           |      (P2')     |
                           +----------------+
                                |     /\
                                |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment      \/     |                    |
|                          +----------------+               |
|                          |    AS 3(P3)    |               |
|                          +-+/\----+/\+----+               |
|                             /       \                     |
|                            /         \                    |
|                           /           \                   |
|                          / (C2P/P2P)   \                  |
|                 +----------------+      \                 |
|                 |     DUT(P4)    |       \                |
|                 ++/\+--+/\+--+/\++        \               |
|    P6[AS 1, AS 2] /     |      \           \              |
|         P2[AS 2] /      |       \           \             |
|                 /       |        \           \            |
|                / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|+----------------+       |          \           \          |
||    AS 2(P2)    |       | P1[AS 1]  \           \         |
|+----------+/\+--+       | P6[AS 1]   \           \        |
|    P6[AS 1] \           | NO_EXPORT   \           \       |
|     P1[AS 1] \          |              \           \      |
|     NO_EXPORT \         |               \           \     |
|                \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|              +----------------+        +----------------+ |
|      Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|              +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P2' denotes the source prefix P2 spoofed by the attacker, which is inside
AS 3 or connected to AS 3 through other ASes.
]]></artwork>
              </figure>
              <t><xref target="direct-attack-p"/> presents the test case of SAV for provider-facing ASes in the scenario of direct attacks. In this test case, the attacker spoofs another source address (P2) and directly targets the victim's IP address (P1), overwhelming its network resources. The Tester performs the source address spoofing function of the attacker. The arrows in <xref target="direct-attack-p"/> represent the commercial relationships between ASes. AS 3 acts as the provider or lateral peer of the DUT and the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, to test whether the DUT can generate accurate SAV rules for provider-facing ASes in the scenario of direct attacks, a testbed is built as shown in <xref target="direct-attack-p"/> to construct the test network environment. The Tester is connected to AS 3 and generates the test traffic toward the DUT.</t>
                </li>
                <li>
                  <t>Then, AS 1, AS 2, AS 3, the DUT, and AS 5 are configured to form the direct attack scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends traffic with source addresses in P2 and destination addresses in P1 (spoofing traffic) to AS 1 via AS 3 and the DUT.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT blocks the spoofing traffic with source addresses in P2 arriving from the direction of AS 3.</t>
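              <t>Across the test cases above, the expected DUT behavior reduces to a per-interface source-prefix filter. The model below is an illustrative abstraction with made-up prefixes, not an implementation of any particular SAV mechanism.</t>
              <sourcecode type="python"><![CDATA[
import ipaddress

# Illustrative per-interface SAV rule table: interface -> accepted source prefixes.
SAV_TABLE = {
    "to_AS2": [ipaddress.ip_network("192.0.2.0/24")],  # stand-in for prefix P2
    "to_AS3": [ipaddress.ip_network("0.0.0.0/0")],     # provider interface, loose
}

def sav_check(interface: str, src: str) -> bool:
    """Accept a packet only if its source matches a prefix allowed on the interface."""
    addr = ipaddress.ip_address(src)
    return any(addr in net for net in SAV_TABLE.get(interface, []))
]]></sourcecode>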
            </section>
          </section>
        </section>
        <section anchor="control-plane-performance-1">
          <name>Control Plane Performance</name>
          <section anchor="protocol-convergence-performance">
            <name>Protocol Convergence Performance</name>
            <t>The test setup, procedure, and metrics follow those described in <xref target="intra-pcp"/>.</t>
          </section>
          <section anchor="protocol-speaking-agent-performance">
            <name>Protocol-speaking Agent Performance</name>
            <t>The test setup, procedure, and metrics follow those described in <xref target="intra-cpp"/>.</t>
          </section>
        </section>
        <section anchor="data-plane-performance-1">
          <name>Data Plane Performance</name>
          <section anchor="data-plane-sav-table-refreshing-performance">
            <name>Data Plane SAV Table Refreshing Performance</name>
            <t>The test setup, procedure, and metrics follow those described in <xref target="intra-dpsavtrp"/>.</t>
          </section>
          <section anchor="data-plane-forwarding-performance">
            <name>Data Plane Forwarding Performance</name>
            <t>The test setup, procedure, and metrics follow those described in <xref target="intra-dpfp"/>.</t>
          </section>
        </section>
      </section>
    </section>
    <section anchor="reporting-format">
      <name>Reporting Format</name>
      <t>Each test has a reporting format that contains global components common to all tests as well as components specific to the individual test. The following test configuration and SAV mechanism parameters <bcp14>MUST</bcp14> be reflected in the test report.</t>
      <t>Test Configuration Parameters:</t>
      <ol spacing="normal" type="1"><li>
          <t>Test device hardware and software versions</t>
        </li>
        <li>
          <t>Device CPU load</t>
        </li>
        <li>
          <t>Network topology</t>
        </li>
        <li>
          <t>Test traffic attributes</t>
        </li>
        <li>
          <t>System configuration (e.g., physical or virtual machine, CPU, memory, caches, operating system, interface capacity)</t>
        </li>
        <li>
          <t>Device configuration (e.g., symmetric routing, NO_EXPORT)</t>
        </li>
        <li>
          <t>SAV mechanism</t>
        </li>
      </ol>
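      <t>For example, the parameters above can be collected into a machine-readable report skeleton. The field names and values below are illustrative only; this document does not mandate a report encoding.</t>
      <sourcecode type="python"><![CDATA[
import json

# Illustrative report skeleton covering the configuration parameters above;
# every field name and value here is an example, not mandated by this document.
report = {
    "device": {"hardware": "example-router", "software": "example-os 1.0"},
    "cpu_load_percent": 12.5,
    "topology": "inter-domain, 5 ASes (see test case figure)",
    "traffic": {"frame_size_bytes": 512, "rate_pps": 100000},
    "system": {"platform": "physical", "memory_gb": 32},
    "device_config": {"symmetric_routing": True, "no_export": True},
    "sav_mechanism": "EFP-uRPF",
    "results": {"spoofed_blocked_ratio": 1.0},
}
print(json.dumps(report, indent=2))
]]></sourcecode>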
    </section>
    <section anchor="IANA">
      <name>IANA Considerations</name>
      <t>This document has no IANA actions.</t>
    </section>
    <section anchor="security">
      <name>Security Considerations</name>
      <t>The benchmarking tests described in this document are limited to the performance characterization of SAV devices in a lab environment with isolated networks.</t>
      <t>The benchmarking network topology will be an independent test setup and <bcp14>MUST NOT</bcp14> be connected to devices that may forward the test traffic into a production network.</t>
    </section>
  </middle>
  <back>
    <references anchor="sec-combined-references">
      <name>References</name>
      <references anchor="sec-normative-references">
        <name>Normative References</name>
        <reference anchor="RFC3704">
          <front>
            <title>Ingress Filtering for Multihomed Networks</title>
            <author fullname="F. Baker" initials="F." surname="Baker"/>
            <author fullname="P. Savola" initials="P." surname="Savola"/>
            <date month="March" year="2004"/>
            <abstract>
              <t>BCP 38, RFC 2827, is designed to limit the impact of distributed denial of service attacks, by denying traffic with spoofed addresses access to the network, and to help ensure that traffic is traceable to its correct source network. As a side effect of protecting the Internet against such attacks, the network implementing the solution also protects itself from this and other attacks, such as spoofed management access to networking equipment. There are cases when this may create problems, e.g., with multihoming. This document describes the current ingress filtering operational mechanisms, examines generic issues related to ingress filtering, and delves into the effects on multihoming in particular. This memo updates RFC 2827. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="3704"/>
          <seriesInfo name="DOI" value="10.17487/RFC3704"/>
        </reference>
        <reference anchor="RFC8704">
          <front>
            <title>Enhanced Feasible-Path Unicast Reverse Path Forwarding</title>
            <author fullname="K. Sriram" initials="K." surname="Sriram"/>
            <author fullname="D. Montgomery" initials="D." surname="Montgomery"/>
            <author fullname="J. Haas" initials="J." surname="Haas"/>
            <date month="February" year="2020"/>
            <abstract>
              <t>This document identifies a need for and proposes improvement of the unicast Reverse Path Forwarding (uRPF) techniques (see RFC 3704) for detection and mitigation of source address spoofing (see BCP 38). Strict uRPF is inflexible about directionality, the loose uRPF is oblivious to directionality, and the current feasible-path uRPF attempts to strike a balance between the two (see RFC 3704). However, as shown in this document, the existing feasible-path uRPF still has shortcomings. This document describes enhanced feasible-path uRPF (EFP-uRPF) techniques that are more flexible (in a meaningful way) about directionality than the feasible-path uRPF (RFC 3704). The proposed EFP-uRPF methods aim to significantly reduce false positives regarding invalid detection in source address validation (SAV). Hence, they can potentially alleviate ISPs' concerns about the possibility of disrupting service for their customers and encourage greater deployment of uRPF techniques. This document updates RFC 3704.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="8704"/>
          <seriesInfo name="DOI" value="10.17487/RFC8704"/>
        </reference>
        <reference anchor="RFC2544">
          <front>
            <title>Benchmarking Methodology for Network Interconnect Devices</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <author fullname="J. McQuaid" initials="J." surname="McQuaid"/>
            <date month="March" year="1999"/>
            <abstract>
              <t>This document is a republication of RFC 1944 correcting the values for the IP addresses which were assigned to be used as the default addresses for networking test equipment. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="2544"/>
          <seriesInfo name="DOI" value="10.17487/RFC2544"/>
        </reference>
        <reference anchor="RFC2119">
          <front>
            <title>Key words for use in RFCs to Indicate Requirement Levels</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <date month="March" year="1997"/>
            <abstract>
              <t>In many standards track documents several words are used to signify the requirements in the specification. These words are often capitalized. This document defines these words as they should be interpreted in IETF documents. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="2119"/>
          <seriesInfo name="DOI" value="10.17487/RFC2119"/>
        </reference>
        <reference anchor="RFC8174">
          <front>
            <title>Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words</title>
            <author fullname="B. Leiba" initials="B." surname="Leiba"/>
            <date month="May" year="2017"/>
            <abstract>
              <t>RFC 2119 specifies common key words that may be used in protocol specifications. This document aims to reduce the ambiguity by clarifying that only UPPERCASE usage of the key words have the defined special meanings.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="8174"/>
          <seriesInfo name="DOI" value="10.17487/RFC8174"/>
        </reference>
      </references>
      <references anchor="sec-informative-references">
        <name>Informative References</name>
        <reference anchor="intra-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Intra-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
        <reference anchor="inter-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-inter-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Inter-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
        <reference anchor="intra-domain-arch" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-architecture/">
          <front>
            <title>Intra-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
        <reference anchor="inter-domain-arch" target="https://datatracker.ietf.org/doc/draft-wu-savnet-inter-domain-architecture/">
          <front>
            <title>Inter-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
      </references>
    </references>

<section numbered="false" anchor="Acknowledgements">
      <name>Acknowledgements</name>
      <t>Many thanks to Aijun Wang, Nan Geng, Susan Hares, and others for their valuable comments on this document.</t>
    </section>
  </back>
  <!-- ##markdown-source:
mhMjEi4LQ6465apacnl+pJzqcLWiLk1S+TSmS5ZkSwoasmPNrEmR9jD/LojU
jlUreeL8G4m8Ts2xWsqB2FkQNxUnJ7JXCyKfO1Mpb6pE8OPIiJRzIi4qJcFW
EyONixbKeRXxqeZHmvMrpfRK5fOm6GI9iM08ifF3HZF64+z4pY7IVZEzOaS8
yUpZky45oc9mORSL9etOpRS+uKkK/KfmzLQ2+RWrQU1pFv3izpPxhEvfW3/z
kLPq7a1/Fhcg1GcNhofqEBAmPb0EfXTOQCrAoQanhCYIwsvtlYhd+0tzQOM0
aZsDothOIRgIz0SsBtE2igBi0SxPIm/n8emr3bqMD1S2QcZnhfpvMzo1GZ1/
0nZSEBlJlKqmDSoJ5h7S4jAXpXH2GYo9cxk3bZrB7IFA/ALrE+wHWEbQQBER
PaRqPMT16hwwi8hoqIqJzRzgKC0InilGOI4ymwCAehv2We/o/Ta4LdbI1YhG
12Y2VD7lUOIojsTq2npQtNW+Ub2KPOtTKYdLUimgJmGsjrthojetij/8Aovy
iISxt14aQxiG32kaoySh9VIXh/WJhMHqqQsa+JhHUFJoi/KbYXyTstdC/BZn
KpOHeLjsGsGNYoxyROAKaloD/3Iw0BAhLUX+qxBToWTjKoUVOCtNh29ADD8S
sbkzAisSsw5r3pCYkSJwUluFmJUqqMk3LCNWoxLuvMMyYnWpA2f+YSlndSkE
Vx7CQaxlKsGRj6jjrEVKoZqXqCOmPk2phUp+Ygmx5hSDtcjCPmrxPpaosw7u
VIOF3gm4KSRztXclj3V0cGs+d1NDYjvHWYb7bBNArleFqSknMKxKnNQEZwCF
79GKk6+u9vbqOTOWhDipNcpMGi4tM3OFiItao9I2JUdc1GpUo0WSxEGthlib
ZEmV2nKldSVNrFSJtbikgViTd3OnLBzEfgggXJhRzqyy8KSc+7aTKNfH2aax
Bmi/Pucbp0twd72Im4qJXblbkslxVyDBIEqDMUHm7S2K5MTBnlYIPNBpgJjC
fgxql2cDoO6QTmRZLyMgi1MMSlyndZkA+SYE/WM+Dyy8vW46wFF53fx35VU6
cdGKW41ZLPYWlz6EzOfq4kpwp/fSAm6D7Ll5EabuMaIhmnZBSgyljAuhQZPF
qdUi6jcxuESz9+zXQbWzKcsIbEDAH0wi0d84DxhH9CdN4ylKFk6yzgiva6w6
O1zmMVRLBCGWJHigrgvAFf0ZhCE8TNTZ8ri2gid+wPCUj1AcOzsN5qk+AoAU
U0StotWpsbmZtkHr5QnqhFmBuGA4hMXFowx1yFXQRG+qMNY3HgfIjEBgKgfj
LP2RLC5Yru/rYe+i6zrD34OPFn+7jETHKwkerTqvTwm8MhjfaMK9af3OalPt
ZRD+6BaELyN2C8LXIXYLwltzdgvCb0F4VyD80S0It39yNO8WhHvNINx681MC
4Y+WgfBHHwKEy3ntzWbll+Bv8ZY5Gb8x/i5VXIe9rddWxN0yQF4Fe7MyfdJN
mnlVB+hlLJnwrBmm9+iMMqgmnNFKf3hd4SJ4h2q4QbSte+8WbN8Q2LaVe81J
btVrfwCcXTYG62DshrXzB2tg7IObwtiP1lvObp9rN1RnAA5LZwBWJ8Zdx9tV
oonqp/WiOPsjPHiLJXLVYkX0ulIxr5gtaltszbYZVe6/WbVEN8sc5aaDdlmO
NxZ4KzHVap3+WhsPaiVQ/2AFMpvuYOiMm662Qmy0GcJBZa1DIBxU1t8QoRC1
ONJt5WMhutkUUZ9WWWlbxPobI7rcGtHx5ohutkesuUFi4y0S5UndjTZJdLhN
oquNEh1tleh+s0Qn2yU23qawPNXgTjR0xE1HPrzLGf5DV3LhsF1yoZjx2hP1
7FUOfVAgbIPpfUctbeb2N6i5Lr9wbZPvAzn5fkOT7Wo9e4Yr9mlRu5WBEFwg
5BhhHkUCPdHQnsd/mbI8zZakKswMg7sD9YVPq6QXxK4DV4oAehxPi0/wOj1u
7XMwzxe3EgKV3END2qKD7IN3s+mH9fW/ZQrC3a2dJSAOP9oExA1M9Detu69L
QgzU7n17X86HmfE/rM1GLDvhry58cDrV5Rv32qcTVkwj6PTBoFX6YJ20Qft0
QTdpghbpgfXTAhumAzbE75sD7s2A9mYAezNgvR6g3hBIbwigVwfOXQDmjoDy
ZgB5RWC8ISBW42pNINwBAN4U+G4IeLsDuhsB3DWBbUeA9iaBLALYwTIAO7gR
ACuilo3Ba828eIk6BMcYPWPgo/anr4tZW86HVyasIyGNyrT14NqmrStgsCqU
tYDgJ40EP04guNY8dLU7/wAAsIsZ6EGnM9AHN479ButgP3Gz3om8VLz+wvSh
ugsYXgUjAzrgV148U+pA16f2xFV7OJxEh+oLs7GF4FTEYFL3ZM59UNV+pb7i
Yvljuli+yyr9uVFlhzfGb8yYvmn+/QqX43ZQ67muEVpGF/9BFd8i/QyfPmHg
7on8VF7srd4hHjKRMcWTOBhEBKCkM+5NwngEZp8OsRnjOa8+uRFVEHzLPI5w
SZrgkMoEEV0umcObxe/FXig8BTGQ1sZ4NSvSsudxGMaX4hiPhM14hqlVGgGk
/OZxwPoqxhn3pywK0hnKDnlLveevT8/QysrsEt5uaJziIxohRpG4hNiiO9Q1
Uz7Fw3QjvTbm4L85iDAZX2JzRLPPM/qCSWD0r6IImM/H4u2T4WsvjNlYPAfL
ptIBWTyPw3iyED88lHUU9wnLE24lQXCLpwuwgrOSEHZ4f9IHdZkuUuogEBU4
hQzFOoNOD3DVHbDQAynNYjw/0oeneEMlXvUozrxNiW5PnIp6jsv41P2vu6Lu
z3VjnHVXTv/sFThAUvhz3+4pab+8p8cvjulIXfSpIj7x3t3Fp+/FoMALdGM/
p5wJqm4UizKMbGQqVf6Ug9PGc4ErpFL5y3s1xkZg/qYzlpBhIsVD7+CDrJWS
mFVix4bBLDAOwDXvn4bWJMAJT4JfmbLZ2E6hKKm4aT1kI9NTCzcQpLG4j1X6
ctGSCoNRSVegcBiiYjMMGcZ8Dh6Rgj1tOUgpSf1fvKQhYEUCijEakTO2UOfm
VOMBUIYYmAczNM6FP5K8EKMAXWg+Q4j/2H8bxZchHu9D99+C3MuP3hNGEFeO
8/Ff7pyzMOUitn/OIkQnLHpLkzrHwU955P2TkRpBM7/j+NdpnsLf3zM0gTzz
+2r5YwDqzsKcTDpFu1h9XOrG/tb/A97PS/CaBwEA

-->

</rfc>
