
Table of Contents

ODL MD-SAL Tutorial

2. Application development – a peek through
	2.1. Environment setting
		- prerequisites
		- maven project setup etc.
	
	2.2. Project Scaffolding
		- provide the steps for project bootstrapping (identify the outcomes of this subsection)
		- explain the pom.xml 
		- complete the scaffolding by generating (configuration files)
	2.3. Resolving project dependencies
	2.4. Feature bundle preparation
		- remove the complexities of features and feature
		- pom of parent
		- complete the feature xml file
		- discuss options to make the features available to karaf
	2.5. Karaf distribution configuration
	2.6. Developing provider plugin
		- provide the details of provider with the aid of figure
		- provide yang file
		- put the screenshots
	2.7. Developing consumer plugin
	2.8.

Introduction and use case scenario

  1. Through figures try to explain MD-SAL/AD-SAL etc.
  2. Explain the diagram that was shown in the meeting.
  3. Explain the plugin, consumer, provider

The Opendaylight (ODL) controller consists of a set of protocol implementations and applications, which can be packaged together to provide the intended control functions. In essence, Opendaylight is an OSGi container [http://en.wikipedia.org/wiki/OSGi], a service platform for the Java programming language that implements a complete and dynamic component model. Applications or components, deployed in the form of bundles, can be remotely installed, started, stopped, updated, and uninstalled without requiring a reboot; management of Java packages/classes is specified in great detail. Application life-cycle management is implemented via APIs that allow for remote downloading of management policies. The service registry allows bundles to detect the addition or removal of services and adapt accordingly.

This tutorial covers two main topics: a summary of the essential concepts of the Opendaylight framework, and a walk-through of the development of the various components (bundles) constituting a functioning controller for an SDN use case.

Opendaylight Software Framework

When an Opendaylight distribution is downloaded and started, the OSGi container loads and manages the necessary bundles. The bundles can either be packaged together with the distribution or downloaded from a remote Maven repository. They can then be loaded automatically on start or manually by the user from the container's console or command line interface.

We elaborate on some essential framework concepts of Opendaylight.

OSGI Container - Karaf

Apache Karaf is a lightweight OSGi container. It is used as the main distribution method of Opendaylight. Karaf is the application platform for Opendaylight controller components, analogous to the popular modular web application containers that host the Java implementations of a web application's frontend, backend, and database interaction. We use Karaf to refer to the Opendaylight controller when discussing its software bundles and components.

Bundle

The implementation of the controller functionality is compiled and packaged as modular Java packages, called bundles, so that it can be managed by Karaf. Inside the Karaf container, all code packages are treated as bundles, regardless of whether they implement a user interface, the OpenFlow protocol, application logic, etc.

Northbound Applications and Services

Software designers use the terms northbound and southbound to refer to a functionality implementation or a component of the controller. Implementations of network applications, orchestration, services, and user interfaces are business use-case specific; they are placed above the SAL. Bundles belonging to core controller services, functions, and extensions are also placed here.

Southbound Interfaces and Protocols

Implementations of protocols and vendor-specific interfaces allow the controller to interface with hardware and virtual components in the data plane. Examples of such protocols are OpenFlow, NETCONF, and SNMP. The bundles compiled for these purposes are placed under the SAL.

SAL – Service Abstraction Layer

Opendaylight is by itself a programming framework for network applications, which extends the OSGi framework with its own implementation of a middleware layer called the SAL. The SAL provides mechanisms for communication and coordination between Opendaylight bundles: it connects bundles that provide data with those that consume data. It is basically a data exchange and adaptation mechanism between plugins.

Plugin - Feature

Opendaylight bundles communicate over the SAL middleware. From a programming perspective, all bundles are treated as plugins to the SAL.

A feature is another concept related to Karaf. It is a collection of bundles whose code implements a certain functionality. Features are used in the sense that when a feature is loaded, the functionality it implements is added to the controller. For example, the openflow-plugin feature includes an OpenFlow protocol implementation bundle, a flow rules manager bundle, and bundles implementing RPCs, among others.

Consumer Plugin

Software developers differentiate between consumer and provider plugins. Consumer plugins register with the SAL to receive notification events when there are changes to the system status; for example, events are created when a packet is sent from a switch to the controller. Consumer plugins may also want to change the state of network elements; they ask the SAL to return the appropriate RPC implementations for their purposes. As such, consumer plugins are often implemented as northbound bundles.

Provider Plugin

Provider plugins implement the interfaces to network elements in the data plane. The implementations of operations on the data plane are exposed in the form of RPCs. The plugins register their RPCs with the SAL so that requests for RPCs from consumer plugins are routed to the appropriate RPC implementation. In MD-SAL, provider plugins also register with the SAL to operate on the MD-SAL internal data store, which the SAL uses to maintain the state of the controller and the network elements. Modifications to this data store result in notification events sent out by the SAL to other plugins.

Model-Driven MD-SAL

In the MD-SAL, the SAL APIs and request routing between consumers and providers are defined from models, and data adaptations are provided by 'internal' adaptation plugins. The API code is generated from models when a plugin is compiled. When the plugin OSGi bundle is loaded into the controller, the API code is loaded into the controller along with the rest of the plugin containing the model. From the point of view of MD-SAL, the Adaptation Plugin is a regular plugin: it provides data to the SAL and consumes data from the SAL through APIs generated from models. An Adaptation Plugin basically performs model-to-model translations between two APIs. Request routing in the MD-SAL is done on both protocol type and node instances, since node instance data is exported from the plugin into the SAL (the model data contains routing information).

The simplest MD-SAL APIs generated from models (RPCs and Notifications, both supported in the yang modeling language) are functionally equivalent to AD-SAL function call APIs. Additionally, the MD-SAL can store data for models defined by plugins. Provider and consumer plugins can exchange data through the MD-SAL storage (more details in later sections). Data in the MD-SAL is accessed through getter and setter APIs generated from models. Note that this is in contrast to the AD-SAL, which is stateless.

The functionality provided by the MD-SAL is basically to facilitate the plumbing between providers and consumers. A provider or a consumer can register itself with the MD-SAL. A consumer can find a provider that it is interested in. A provider can generate notifications; a consumer can receive notifications and issue RPCs to get data from providers. A provider can insert data into SAL’s storage; a consumer can read data from SAL’s storage.
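
To make this plumbing concrete, the following is a minimal sketch (not the archetype's generated code) of a provider writing into, and a consumer reading from, the MD-SAL data store through the Data Broker. It is a sketch only: it assumes the binding DataBroker API of this controller release and the Task/TaskBuilder classes generated from the sample 'task' model developed later in this tutorial.

  import com.google.common.base.Optional;
  import org.opendaylight.controller.md.sal.binding.api.DataBroker;
  import org.opendaylight.controller.md.sal.binding.api.ReadOnlyTransaction;
  import org.opendaylight.controller.md.sal.binding.api.WriteTransaction;
  import org.opendaylight.controller.md.sal.common.api.data.LogicalDatastoreType;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.Task;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.TaskBuilder;
  import org.opendaylight.yangtools.yang.binding.InstanceIdentifier;

  public class DataStoreExample {

      // Path to the top-level 'task' container from the sample model.
      private static final InstanceIdentifier<Task> TASK_IID = InstanceIdentifier.create(Task.class);

      // A provider writes data into the operational data store.
      static void writeTask(DataBroker broker) {
          WriteTransaction tx = broker.newWriteOnlyTransaction();
          tx.put(LogicalDatastoreType.OPERATIONAL, TASK_IID, new TaskBuilder().build());
          tx.submit(); // asynchronous commit; errors are reported on the returned future
      }

      // A consumer reads the same data back.
      static Optional<Task> readTask(DataBroker broker) throws Exception {
          ReadOnlyTransaction tx = broker.newReadOnlyTransaction();
          return tx.read(LogicalDatastoreType.OPERATIONAL, TASK_IID).get();
      }
  }

The write is committed asynchronously; the SAL then notifies any plugins that registered for changes on that subtree.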

API-Driven AD-SAL

Registration Subsystem

When loaded, providers register the RPCs they implement with the SAL, and consumers register with the SAL for configuration data notifications. The SAL component responsible for this is called the Config Subsystem, which consists of three SAL services: the RPC Registry, the Notification Service, and the Data Broker. MD-SAL plugins are defined in .yang files, which contain the descriptions of the plugins in the YANG modeling language. The .yang files also specify the dependencies on the SAL services needed by the plugins.

Use-case (MD-SAL): Openvswitch Flows Management with OF 1.3

With the above concepts in mind, in this section we describe how some use cases in a flow management application are realized with the Opendaylight framework. In these scenarios, the 'Flow Programmer Service' is a northbound, consumer plugin; the OF Plugin is a southbound, provider plugin.

The OF Library is a set of Java code implementing the OpenFlow protocol. This library is the interface for communicating with OpenFlow-capable data plane elements. The library is used by the OF Plugin to create the messages and state data required to manage network elements. Within the controller, the OF Plugin exposes RPCs that other plugins can call with parameters to modify the network elements' state. For example, addFlow() and delFlow() are RPCs used by the Flow Programmer Service, which contains the logic to manage the network elements.

"Flow Deleted" notification scenario

The scenario is as follows:

  • The Flow Programmer Service registers with the MD SAL for the “Flow Deleted” notification. This is done when the controller and its plugins or applications are started.
  • A “Flow Deleted” OF packet arrives at the controller. The OF Library receives the packet on the TCP/TLS connection to the sending switch and passes it to the OF Plugin.
  • The OF Plugin parses the packet and uses the parsed data to create a “Flow Deleted” SAL notification. The notification is actually an immutable “Flow Deleted” Data Transfer Object (DTO) that is created or populated by means of methods from the model-generated OF Plugin API.
  • The OF Plugin sends the “Flow Deleted” SAL notification (containing the notification DTO) into the SAL. The SAL routes the notification to registered consumers, in this case, the Flow Programmer Service.
  • The Flow Programmer Service receives the notification containing the notification DTO.
  • The Flow Programmer Service uses methods from the model-generated OF Plugin API to get the data from the immutable notification DTO received in the previous step. The processing is the same as in the AD-SAL. (A minimal sketch of such a listener follows this list.)
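
The consumer side of this scenario can be sketched as below. This is a sketch only: SampleFlowListener, FlowDeleted, and getNode() are placeholders for whatever yangtools generates from the OF Plugin's model; only the NotificationService registration call is assumed to be the actual SAL API.

  import org.opendaylight.controller.sal.binding.api.NotificationService;
  import org.opendaylight.yangtools.concepts.ListenerRegistration;

  // 'SampleFlowListener' and 'FlowDeleted' stand in for the listener interface and
  // notification DTO generated from the OF Plugin's model; SampleFlowListener is
  // assumed to extend the binding NotificationListener marker interface.
  public class FlowProgrammerListener implements SampleFlowListener {

      // Step 1 of the scenario: register with the SAL for notifications.
      public ListenerRegistration<?> register(NotificationService notificationService) {
          return notificationService.registerNotificationListener(this);
      }

      // Called by the SAL when the OF Plugin publishes the notification.
      @Override
      public void onFlowDeleted(FlowDeleted notification) {
          // Final step: read data from the immutable DTO via its generated getters
          // (getNode() is a hypothetical getter used for illustration).
          System.out.println("Flow removed on node " + notification.getNode());
      }
  }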

"Add Flow" scenario

The scenario is as follows (this walkthrough still needs simplification):

  • Registrations are performed when the controller and its plugins or applications are started: a) the Flow Programmer Service registers with the MD-SAL for Flow configuration data notifications, and b) the OF Plugin registers (among others) the ‘AddFlow’ RPC implementation with the SAL. Note that the RPC is defined in the OF Plugin model, and that the API is generated at build time.
  • A client application requests a flow add through the REST API of the controller. (Note that in the AD-SAL, there is a dedicated NB REST API on top of the Flow Programming Service. The MD-SAL provides a common infrastructure where data and functions defined in models can be accessed by means of a common REST API. For more information, see http://datatracker.ietf.org/doc/draft-bierman-netconf-restconf/). The client application provides all parameters for the flow in the REST call.
  • Data from the ‘Add Flow’ request is deserialized, and a new flow is created in the Flow Service configuration data tree. (Note that in this example, the configuration and operational data trees are separated; this may be different for other services). Note also that the REST call returns success to the caller as soon as the flow data is written to the configuration data tree.
  • Since the Flow Programmer Service is registered to receive notifications for data changes in the Flow Service data tree, the MD-SAL generates a ‘data changed’ notification to the Flow Programmer Service.
  • The Flow Programmer Service reads the newly added flow, and performs a flow add operation (which is basically the same as in the AD-SAL).
  • At some point during the flow addition operation, the Flow Programmer Service needs to tell the OF Plugin to add the flow in the appropriate switch. The Flow Programmer Service uses the OF Plugin generated API to create the RPC input parameter DTO for the “AddFlow” RPC of the OF Plugin.
  • The Flow Programmer Service gets the service instance (actually, a proxy), and invokes the “AddFlow” RPC on the service. The MD-SAL routes the request to the appropriate OF Plugin (which implements the requested RPC). See the sketch after this list.
  • The “AddFlow” RPC request is routed to the OF Plugin, and the implementation method of the “AddFlow” RPC is invoked.
  • The “AddFlow” RPC implementation uses the OF Plugin API to read values from the DTO of the RPC input parameter. (Note that the implementation will use the getter methods of the DTO generated from the yang model of the RPC to read the values from the received DTO.)
  • The “AddFlow” RPC is further processed (pretty much the same as in the AD-SAL) and at some point, the corresponding flowmod is sent to the corresponding switch.
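
The middle steps (obtaining the RPC service proxy, building the input DTO, and invoking the RPC) can be sketched as follows. This is a sketch under assumptions: SalFlowService, AddFlowInput/AddFlowOutput and their builders reflect our reading of the classes generated from the OF Plugin's flow-service model, the setter shown is an example parameter only, and the imports of those generated classes are omitted.

  import java.util.concurrent.Future;
  import org.opendaylight.controller.sal.binding.api.RpcConsumerRegistry;
  import org.opendaylight.yangtools.yang.common.RpcResult;
  // Imports of the model-generated classes (SalFlowService, AddFlowInput,
  // AddFlowInputBuilder, AddFlowOutput) are omitted; they come from the
  // OF Plugin's flow-service model.

  public class FlowProgrammer {

      public Future<RpcResult<AddFlowOutput>> pushFlow(RpcConsumerRegistry rpcRegistry) {
          // Obtain a proxy for the RPC service; MD-SAL routes the call to the OF Plugin.
          SalFlowService flowService = rpcRegistry.getRpcService(SalFlowService.class);

          // Build the immutable input DTO with the model-generated builder.
          AddFlowInput input = new AddFlowInputBuilder()
                  .setPriority(100)   // example flow parameter
                  .build();

          // Invoke the RPC; the result is delivered asynchronously.
          return flowService.addFlow(input);
      }
  }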

Application Development – A Peek Through

Environment Settings

  1. prerequisites
  2. maven project setup etc.

Prerequisites

Java 1.7.0

Note that Java 1.7.0 is required instead of the newer Java 1.8.0. Although its support has ended, Opendaylight relies on a Java package (??) that has not yet been ported to Java 1.8.0.

CentOS/Fedora:

sudo yum install java-1.7.0-openjdk
sudo yum install java-1.7.0-openjdk-devel

Ubuntu:

sudo apt-get install openjdk-7-jre
sudo apt-get install openjdk-7-jdk

Heap Size

When developing, Karaf is often started from the console using ./bin/start. Set the following minimum memory parameters for Karaf:

.bashrc
## Karaf
export JAVA_MIN_MEM=512M # Minimum memory for the JVM
export JAVA_MAX_MEM=1024M # Maximum memory for the JVM
export JAVA_PERM_MEM=128M # Minimum perm memory for the JVM
export JAVA_MAX_PERM_MEM=256M # Maximum perm memory for the JVM

If Karaf is run as a service, set the maximum Java heap size in etc/karaf-wrapper.conf:

  # Maximum Java Heap Size (in MB)
  wrapper.java.maxmemory=1024

Maven Setup

Apache Maven 3.1.1 or newer is required.

http://maven.apache.org/download.cgi

# Shortcut command for grabbing settings.xml
cp -n ~/.m2/settings.xml{,.orig} ; \wget -q -O - https://raw.githubusercontent.com/opendaylight/odlparent/master/settings.xml > ~/.m2/settings.xml

vi ~/.bashrc
Add the following:
export MAVEN_OPTS='-Xmx1048m -XX:MaxPermSize=512m'

MD-SAL Plugin Development Process

MD-SAL works with models defined using the YANG modeling language, originally developed for the NETCONF network configuration protocol. The name is an acronym for “Yet Another Next Generation”. The YANG data modeling language was developed by the NETMOD working group in the Internet Engineering Task Force (IETF) and was published as RFC 6020.

Using models allows the application of a code generator, provided by yang-tools, across different plugin development projects. The compatibility of plugins across projects is also maintained.

  • The development process starts with the production of a .yang file, which contains the model definition and RPC methods on model entities. For service plugins, the file contains only the dependency declarations needed to wire the plugin into the SAL framework. yang-tools is used to generate Java code from the models.
  • The generated Java sources contain API definitions and stub code, which can be extended with the plugin's own logic implementation using the generated API code.
  • From here the development process is the same as in any Maven-based Java project: build, test, package, install.

The next sections will walk through the development of Opendaylight plugins.

Project Scaffolding

  2.2. Project Scaffolding
	- provide the steps for project bootstrapping (identify the outcomes of this subsection)
	- explain the pom.xml 
	- complete the scaffolding by generating (configuration files)

Opendaylight bundle/plugin projects can use Maven for convenient packaging and dependency management. A Maven MD-SAL archetype, called MD-SAL-App-Simple, is available for generating the project structure and stub code. A Maven archetype is essentially a Maven project template. It is intended to give developers a head start in developing applications and allows them to ignore some of the more complex MD-SAL features during their initial hacking.

Run the Maven Archetype

'cd' to the folder where you want to place your MD-SAL application. The archetype will generate a parent project and several child bundles.

  mkdir odl_project; cd odl_project
  

Run the following mvn command to create the project:

mvn archetype:generate -DarchetypeGroupId=org.opendaylight.controller \
  -DarchetypeRepository=http://nexus.opendaylight.org/content/repositories/opendaylight.snapshot/ \
  -DarchetypeCatalog=http://nexus.opendaylight.org/content/repositories/opendaylight.snapshot/archetype-catalog.xml

Note: a possible error:

  A required class was missing while executing org.apache.maven.plugins:maven-archetype-plugin:2.3:generate: org/apache/commons/lang/StringUtils

If this happens, remove the local repository, re-download settings.xml, and regenerate:

  rm -rf ~/.m2
  # re-download settings.xml (see Maven Setup above)
  # regenerate the project

When prompted, choose the template for the MD-SAL app:

  org.opendaylight.toolkit:md-sal-app-simple
  

Maven will ask for the standard project parameters:
groupId: The group ID is a high-level grouping, starting with the company or organization, such as org.opendaylight.controller.odl_project.

artifactId: The artifact ID will be the prefix for the folder names and all Maven sub-projects. Typically this describes the functionality or project group, such as “odl-flow-app”.

version: The version of this artifact. You can hit enter to accept the default of “1.0-SNAPSHOT” or enter your own version.

package: Defines the prefix for all packages. This defaults to a package created from the group id.

modelFields: This property lets you define the fields used when applying the MD-SAL archetype to your own data model. See the section about the 'model' below. For now, simply accept the default, which is the “task” model.

The maven archetype generation should now have completed, resulting in the following folder structure:

<artifactId>
    consumer
    generate
    model
    provider
    web
    features

Before we discuss what each of these bundles provides, continue to the next section to finish the template generation.

Finish Generation - Run the ‘Generator’

These quick steps outline the commands to run immediately after creating the projects.

Because Maven archetypes are extremely limited in their capabilities, we need to run one more step to generate the remainder of the template code needed for our sample application. To do this we make use of the “generate” folder created in the previous step.

cd into the /generate project.
Run mvn clean install -Dgen

This copies a number of additional files to the remaining four projects and finishes the initialization. cd up one directory and run a complete build:

  mvn clean install

You should see that the parent project (named after your artifact id), and four child projects (consumer, model, provider, and web) all compile successfully.

Delete the generate folder as it is no longer needed.

You now have a fully functional MD-SAL application template which provides a number of capabilities. Read on for more information about what each project provides a template for, and which you can discard.

At this point you can import your projects into Eclipse or another IDE as you desire.

What Does the MD-SAL Archetype Provide?

Once the archetype creation completes, you will have the following subfolders created.

generate

This is a helper folder which is used immediately after generating these files. It allows us to configure your MD-SAL project more intelligently than a basic Maven archetype can. Running this project with 'mvn clean install -Dgen' as above triggers one Java class, CodeGenerator, present under the generate folder. The CodeGenerator class uses Velocity templates to generate files under the model, provider, consumer, and web projects, as well as configuration files. It generates all YANG model, Java service, and config subsystem files for your application, based on the application's name.

You can remove the generate project from your application's directory structure, and its reference from the parent pom, once you have created the initial working application and started making your own changes.

model

This folder contains the sample YANG file which defines your “model”. The generated file contains minimal sample YANG content, including an RPC definition. Generally speaking, you will not need to add any Java code here (it is auto-generated from the YANG file); there are a few exceptions when writing complex YANG files. The only manually modified code in this project is generally the YANG file which models your service.

provider

The provider bundle/plugin sets you up with a service that implements any RPCs (remote procedure calls) that were defined in the model's YANG files. Additionally, it automatically sets your application up with access to the following MD-SAL services:

  • DataBroker for reading, writing and listening to changes on models in the store
  • RpcRegistryService - for registering your RPC implementation, or invoking other RPC implementations defined elsewhere.
  • NotificationProviderService - for sending any notifications defined in your YANG file.

consumer

The consumer bundle illustrates how you could write an application which consumes the RPC services provided by your provider bundle (but does not itself implement any service defined in a YANG file). You may want to use this template if you have business logic which needs to use the RPC implementation defined in the provider, BUT is logically a separate piece of functionality.

In the consumer we only set up initial access to the RpcRegistryService; however, with a few modifications you can gain access to the DataBroker and NotificationProviderService as well.

web

The web bundle provides an application that allows you to define custom REST APIs. Remember, you get access to RESTConf web services for your model automatically just by deploying your model; you do NOT need this bundle to use the RESTConf web services.

features

The Karaf features directory provides a list of sample features to enable provider, consumer and web.

Most applications only need the model and the provider bundles.

Additional Generation Parameters

Naming your Application (optional)

By default the word “Task” is used to prefix your YANG files and is thus used in your generated package names, Java interfaces, etc. To change this prefix, simply add the following option to the Maven archetype command line:

  -DappName=

The artifact ID and group ID can also be provided on the command line using the parameters:

  -DgroupId=«Any group Id such as org.opendaylight.controller»
  -DartifactId=«Any app name such as mdSalExample»

How to Remove Unneeded Projects

To remove unneeded projects, delete the project directory (e.g. 'rm -rf web' or 'rm -rf consumer'), then modify the pom.xml file in the root folder (which has the same name as your artifact) and remove the unneeded projects from the “modules” section. Then open the features.xml file in the features/src/main/resources directory and remove the feature for the project you are removing (e.g. the web or consumer feature). Finally, run 'mvn clean install' to verify that all projects still build.

Note that the other projects (provider, consumer, web) require the model project in order to build correctly.

Tips for Editing the Template Code

A few tips to remember when you start to edit the template code:

  • Build often! Your projects should build quickly (especially if you run the build command from the root project). Building often will allow you to catch errors quickly.
  • There are a number of auto-generated Java classes that are created from your YANG model. After you modify the model YANG file you will want to recompile, which may result in compile errors in the provider or consumer bundles which use that code. Consider commenting out the example code until things compile again.
  • Remove the projects you don’t think you need - this will reduce the time you may be tracing down code generation issues.
  • A .gitignore file is created which ignores files that should not be checked in. If you are not using git make sure you set up the proper ignore files for your source control.
  • Build project with <code>
    mvn clean install -nsu -DskipTests </code>

Generating a YANG Model with the MD-SAL-App-Simple Archetype

The advanced features of the MD-SAL-App-Simple archetype provide better auto-generation of a YANG file for you, instead of you having to edit the file manually. To do this you will make use of the “modelFields” field which we skipped (or rather accepted the defaults for) before.

When prompted during the interactive mode, or optionally using the ‘-DmodelFields’ flag on the command line, you can provide a JSON formatted list of fields that you want included in your YANG file.

The Top pom.xml - Resolving Project dependencies

This Opendaylight bundle project is also a Maven project, so project dependencies are managed exactly as in any Maven-based Java development project.

Karaf Feature

 remove the complexities of features and feature
 pom of parent
 complete the feature xml file
 discuss options to make the features available to karaf

Apache Karaf provides a simple and flexible way to provision applications. In Apache Karaf, the unit of application provisioning is a feature.

A feature describes an application as:

  • a name
  • a version
  • an optional description (possibly with a long description)
  • a set of bundles
  • an optional set of configurations or configuration files
  • an optional set of dependency features

When you install a feature, Apache Karaf installs all resources described in the feature. Apache Karaf will automatically resolve and install all bundles, configurations, and dependency features described in the feature.

We will have a detailed look at the generated 'features' folder and find out how the feature bundle is used by Karaf to manage applications: defining a feature with dependencies, adding the feature repository to Karaf, and deploying the feature in Karaf.

Define a Karaf Feature

features.xml

The generated 'features' folder contains a feature project, which is a Maven-based project where the Karaf feature is defined in the src/main/resources/features.xml file.

  7     <feature name='odl-task-provider' version='${project.version}'>
  8         <feature version='${yangtools.version}'>odl-yangtools-common</feature>
  9         <feature version='${yangtools.version}'>odl-yangtools-binding</feature>
 10         <feature version='${mdsal.version}'>odl-mdsal-broker</feature>
 11         <bundle>mvn:de.dailab.test/${artifactName}-model/${project.version}</bundle>
 12         <bundle>mvn:de.dailab.test/${artifactName}-provider/${project.version}</bundle>
 13         <configfile finalname="${config.configfile.directory}/05-task-provider-config.xml">mvn:de.dailab.test/${artifactName}-provider/${project.version}/xml/config</configfile>
 14     </feature>

In this code snippet, we have added a feature odl-task-provider. This feature depends on another feature, odl-mdsal-broker, which is defined in Opendaylight. During feature installation, all dependent features are automatically installed by Karaf if they are available.

pom.xml

In the pom.xml file of the features project, we use a Maven plugin named build-helper-maven-plugin to make the features.xml available outside the jar in the Maven repository. This helps us add our feature repository when running the Karaf distribution:

    <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>build-helper-maven-plugin</artifactId>
        <executions>
          <execution>
            <id>attach-artifacts</id>
            <goals>
              <goal>attach-artifact</goal>
            </goals>
            <phase>package</phase>
            <configuration>
              <artifacts>
                <artifact>
                  <file>${project.build.directory}/classes/${features.file}</file>
                  <type>xml</type>
                  <classifier>features</classifier>
                </artifact>
              </artifacts>
            </configuration>
          </execution>
        </executions>
      </plugin>

Build the Maven Project

To build a Karaf feature project, in the features directory run:

  mvn clean install

Once done, a jar archive that includes the features.xml and any jar files containing Java code will be available in the local Maven repository (there are no such Java files in this example). The .jar file will be located here:

.m2/repository/com/cisco/controller/samples/features-odlFlowApp/1.0.0-SNAPSHOT/features-odlflowapp-features-1.0.0-SNAPSHOT.jar

Feature Dependency

The Karaf feature we are developing may use services (RPCs, models) provided by other features. Those features must be installed into the Maven repository and specified in the dependent feature's definition. The features.xml file also has to specify the repository locations of those features:

<repository>mvn:org.opendaylight.yangtools/features-yangtools/${yangtools.version}/xml/features</repository>
<repository>mvn:org.opendaylight.controller/features-mdsal/${controller.mdsal.version}/xml/features</repository>
<!--<repository>mvn:org.opendaylight.controller/features-restconf/${controller.restconf.version}/xml/features</repository>-->

Meta-Feature

Features can be grouped under a single feature, which we call a 'meta-feature'. Its name often ends with -all, and when it is loaded, all the features it includes are activated in Karaf:

  <feature name="odl-adsal-all" description="OpenDaylight AD-SAL All Features" version="${sal.version}">
      <feature version="${sal.version}">odl-adsal-core</feature>
      <feature version="${sal.networkconfiguration.version}">odl-adsal-networkconfiguration</feature>
      <feature version="${sal.connection.version}">odl-adsal-connection</feature>
      <feature version="${clustering.services.version}">odl-adsal-clustering</feature>
      <feature version="${configuration.version}">odl-adsal-configuration</feature>
   </feature>

Deploy the Features

To deploy the feature into the Karaf container and make it available for installation, you have two options:

Use the Open SDN Controller Administrative GUI

(is this possible?)

    Navigate to the Feature Management screen: select System Management > Features
    Click Add Feature
    Click Browse and select the .KAR file from your local maven repository. Click Open.
    Click Add.
    The feature appears in the list of available features.
    Click Activate to install the feature.
    There should be a check mark next to the feature in the Active column.

Use the Karaf Console:

  • Locate your ODL distribution. The distribution for the core controller is found under the controller git repository (after running a mvn clean install from the controller root). The path is controller/opendaylight/distribution/opendaylight-karaf/target/assembly/bin.
  • Start karaf via command ./karaf.
  • Add your feature repository to Karaf. This can be done as follows, where <artifactId>, <groupId> and <version> are what you supplied before: feature:repo-add mvn:<groupId>/features-<artifactId>/<version>/xml/features

Since the features bundle is installed and available in the local Maven (.m2) repository, Karaf can find it there. Example:

  feature:repo-add mvn:org.opendaylight.controller/features-odlFlowApp/1.0-SNAPSHOT/xml/features

Once the feature repo is added, install the feature on the Karaf console via a command such as:

feature:install odl-task-web

If you have picked a different appName, the feature name will be odl-<appName>-web.

To confirm that the feature is available for installation, run the following command on the karaf console:

feature:list

It will list all the features available in the karaf instance. You can use grep to reduce the list:

feature:list | grep task

If the feature is visible in the list, it can be installed using the following command:

feature:install odl-task-provider

Once installed, run the following command to list your feature again:

feature:list | grep task

An X mark should be present beside the feature entry, indicating that the feature is installed.

Local Karaf Distribution

Building a local controller is an alternative to downloading an Opendaylight controller configured with many unnecessary or conflicting bundles. The local controller can be configured to include only our bundles and their dependencies. All we need is a pom.xml file containing a list of bundles to be packaged into our customized controller. For this task, we can also use a Maven archetype.

Create a folder for the Karaf distribution project and generate the controller pom.xml:

  mkdir -p distribution
  cd distribution
  mvn archetype:generate -DarchetypeGroupId=org.opendaylight.controller \
  -DarchetypeRepository=http://nexus.opendaylight.org/content/repositories/opendaylight.snapshot/ \
  -DarchetypeCatalog=http://nexus.opendaylight.org/content/repositories/opendaylight.snapshot/archetype-catalog.xml

Select the Karaf-distro-archetype:

  org.opendaylight.controller:opendaylight-karaf-distro-archetype
  

Input parameters:

  groupId: (enter your project groupId)
  artifactId: distribution-karaf
  version: (version of your project)
  package: (accept default)
  repoName: (your projects repoName, examples: git:opendaylight/controller, yangtools, openflowplugin, mma)

Choosing Features to Package

The features required for the distribution are of type 'xml' (not jar or kar) and are specified in <dependencies>, which tells the build where to find and download them. As a result, the bundles needed for our Karaf features will be distributed with our controller.

This listing shows that features-mdsal must also be included, because the provider and consumer plugins in our features-dmm depend on it.

110     <dependency>
111         <groupId>org.opendaylight.controller</groupId>
112         <artifactId>features-mdsal</artifactId>
113         <classifier>features</classifier>
114         <type>xml</type>
115         <scope>runtime</scope>
116         <version>${mdsal.version}</version>
117       </dependency>
118       <dependency>
119         <groupId>de.dailab.nemo.dmm</groupId>
120         <artifactId>features-dmm</artifactId>
121         <classifier>features</classifier>
122         <type>xml</type>
123         <scope>runtime</scope>
124         <version>${project.version}</version>
125       </dependency>

In the <build> section, under <bootFeatures>, we can tell the karaf-maven-plugin which features to include in the local distro:

211     <plugins>
212       <plugin>
213         <groupId>org.apache.karaf.tooling</groupId>
214         <artifactId>karaf-maven-plugin</artifactId>
215         <version>${karaf.version}</version>
216         <extensions>true</extensions>
217         <configuration>
218           <bootFeatures>
219             <feature>standard</feature>
220             <!--
221               Optional TODO: Add entries here for the features you want in your local distro
222               Note: odl-restconf is a separate feature from odl-mdsal-broker.  If you want
223               restconf, you need to list it here explicitely.
224               Examples:
225               <feature>odl-openflowplugin-flow-services</feature>
226               <feature>odl-restconf</feature>
227             -->
228             <!-- Final TODO: Remove TODO Comments ;) -->
229           </bootFeatures>
230         </configuration>

MD-SAL Model

YANG Modelling Language

The auto-generated 'task' model is written in YANG, a data modeling language used by the Network Configuration Protocol (NETCONF). A YANG module defines a hierarchy of data that can be used for NETCONF-based operations, including configuration, state data, Remote Procedure Calls (RPCs), and notifications. This allows a complete description of all data sent between a NETCONF client and server.

We have defined a task module inside the model/src/main/yang/task.yang file. This module contains constructs such as grouping, container, leaf, rpc, and notification. We will explain each of them and how you can use them to define a model.

  1 module task {
  2 
  3     yang-version 1;
  4 
  5     namespace "opendaylight:sample";
  6 
  7     prefix task;
  8 
  9     import ietf-inet-types { prefix "inet"; revision-date 2010-09-24; }

Container Nodes

We have defined a 'task' container inside the module. A container node is used to group related nodes in a subtree. A container has only child nodes and no value. A container may contain any number of child nodes of any type (including leafs, lists, containers, and leaf-lists).

In this case we have defined a list inside the container.

 29     container task {
 30       description
 31         "Top-level container for all application database objects.";
 32       list entry {
 33         key "entry-id";
 34         leaf entry-id {
 35           type entry-id;
 36           description "identifier of single list of entries.";
 37         }
 38         leaf title {
 39           type string;
 40         }
 41         leaf desc {
 42           type string;
 43         }
 44       }
 45     }

Leaf Nodes

  leaf owner {
    type string;
  }

A leaf node contains simple data like an integer or a string. It has exactly one value of a particular type and has no child nodes.

In the example model, owner leaf is of type string.

List Nodes

Example:

list coffee-log {
    config false;
    key type;

    leaf type {
      type string {
        pattern '[a-zA-Z].*';
      }
    }
    leaf last-make-time {
      type string;
    }
  }

A list defines a sequence of list entries. Each entry is like a structure or a record instance, and is uniquely identified by the values of its key leaf entries. A list can define multiple key leaf entries and may contain any number of child nodes of any type (including leafs, lists, and containers).

In the example model, the list coffee-log has two leaf entries, type and last-make-time, as its children. The type leaf is the key of the list, so its value must be unique and identifies each entry in the list.

Derived Types (typedef)

Example

typedef cm-response-type {
    type enumeration {
        enum done { value 0;}
        enum error { value 1;}
    }
}

Recall that the owner leaf is of type string. String is a built-in type in YANG. YANG also allows us to define derived types from base types using the typedef statement. A base type can be either a built-in type or a derived type, allowing you to develop a hierarchy of derived types.

In our model, we have defined cm-response-type as a derived type.

Reusable Node Groups (grouping)

Example

grouping supplies {

  leaf water-level {
    config false;
    type uint8;
  }

  leaf coffee-supply-level {
    config false;
    type uint8;
  }

}

Groups of nodes can be assembled into reusable collections using the grouping statement. A grouping defines a set of nodes that are instantiated with the uses statement.

In our model, we have defined supplies as a grouping definition that has two leaf nodes under it. This grouping can be used inside a container as follows:

uses supplies; 

RPC Definitions

YANG allows you to define NETCONF RPCs. The operations' names, input parameters, and output parameters are modeled using YANG data definition statements.

In our model, we added the following RPC definition:

 47     rpc saveEntry {
 48      description " Method to add a new entry into datastore.";
 49      input {
 50        list entryField {
 51          leaf key {
 52            type string;
 53            description "name of the field";
 54          }
 55 
 56          leaf value {
 57            type string;
 58            description "value of the field";
 59          }
 60        }
 61 
 62        leaf entryId {
 63          type string;
 64          description "entry Identifier";
 65        }
 66      }
        output {
          leaf additional-info {
            type string;
          }
        } //output
 67     }

In our service provider, we will implement the saveEntry RPC. This operation takes as input a leaf named entryId of type string and a list entryField whose entries each have two leaves of type string. This will act as an API for invoking saveEntry on request.

The output of an RPC is also specified here, with one or more leaves containing the returned data; these can be of a built-in type such as string or of a derived type.

Notification Definitions

Example

notification task-event {
    uses supplies;
}

YANG allows you to define notifications suitable for NETCONF. YANG data definition statements are used to model the content of the notification.

In the example, we have added a task-event notification providing details about the supplies, which a service provider can emit. Another service/consumer can add a listener for that notification and perform the appropriate actions.

Generate Java Code from YANG Model

YANG Maven Plugin

With the help of the yang-maven-plugin available in the yangtools project of opendaylight, we will generate Java code for our YANG model.

In the pom.xml file of the dmm-model project, we configure the yang-maven-plugin to generate Java code for the task model from the task.yang file. This is the listing of the plugin in the <build> section:

 34             <plugin>
 35                 <groupId>org.opendaylight.yangtools</groupId>
 36                 <artifactId>yang-maven-plugin</artifactId>
 37                 <version>${yangtools.version}</version>
 38                 <executions>
 39                     <execution>
 40                         <goals>
 41                             <goal>generate-sources</goal>
 42                         </goals>
 43                         <configuration>
 44                             <yangFilesRootDir>src/main/yang</yangFilesRootDir>
 45                             <codeGenerators>
 46                                 <generator>
 47                                     <codeGeneratorClass>org.opendaylight.yangtools.maven.sal.api.gen.plugin.CodeGeneratorImpl</codeGeneratorClass>
 48                                     <outputBaseDir>${salGeneratorPath}</outputBaseDir>
 49                                 </generator>
 50                             </codeGenerators>
 51                             <inspectDependencies>true</inspectDependencies>
 52                         </configuration>
 53                     </execution>
 54                 </executions>
 55                 <dependencies>
 56                     <dependency>
 57                         <groupId>org.opendaylight.yangtools</groupId>
 58                         <artifactId>maven-sal-api-gen-plugin</artifactId>
 59                         <version>${yangtools.version}</version>
 60                         <type>jar</type>
 61                     </dependency>
 62                 </dependencies>
 63             </plugin>

The Java implementation of the code generator is 'org.opendaylight.yangtools.maven.sal.api.gen.plugin.CodeGeneratorImpl', which is automatically available when Maven downloads the yang-maven-plugin.

'<outputBaseDir>${salGeneratorPath}</outputBaseDir>' tells Maven where to put the generated code. ${salGeneratorPath} is a Maven property that resolves to model/src/main/yang-gen-sal.

Generated Code

Once we run the Maven build in the model project, the generated files will be located in the org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407 package.

The resulting source code contains Data Transfer Object (DTO) classes and builder classes. These classes are listed below:

[dang@dai142 dmm]$ tree model/src/main/yang-gen-sal/org/opendaylight/yang/gen/v1/opendaylight/sample/rev140407/
model/src/main/yang-gen-sal/org/opendaylight/yang/gen/v1/opendaylight/sample/rev140407/
├── EntryId.java
├── saveentry
│   └── input
│       ├── EntryFieldBuilder.java
│       └── EntryField.java
├── SaveEntryInputBuilder.java
├── SaveEntryInput.java
├── task
│   ├── EntryBuilder.java
│   ├── Entry.java
│   └── EntryKey.java
├── TaskBuilder.java
├── TaskData.java
├── Task.java
├── TaskService.java
├── $YangModelBindingProvider.java
└── $YangModuleInfoImpl.java

Let's look at some of the generated Java files and how they map to our YANG model.

$YangModuleInfoImpl.java

This class provides basic information about the YANG module, including the name, namespace, and revision. This class has a private constructor: use the $YangModelBindingProvider class to create an instance of $YangModuleInfoImpl.

TaskData.java

The task container stores the data in the module. This class gives you methods to access the container object, TaskData, which stores data at the root level.

Data and Builder Classes

Task is the Java interface for the container representation defined in the task.yang model. The implementation of this interface is a private class inside TaskBuilder.java. The YANG Maven plugin generates a builder class for each entity, and these builder classes are used to create the final objects. When we create a Task object, we pass in the required instance variables.

Each builder class provides a build method to create a new object.
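
As an illustration, a Task object from the sample model might be assembled as sketched below. This is a sketch only; the setter names (setEntryId, setTitle, setDesc, setEntry) and the EntryId constructor are assumptions about what yangtools generates from task.yang.

  import java.util.Arrays;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.EntryId;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.Task;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.TaskBuilder;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.task.Entry;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.task.EntryBuilder;

  public class TaskObjects {

      static Task buildSampleTask() {
          // Build one element of the 'entry' list defined in task.yang.
          Entry entry = new EntryBuilder()
                  .setEntryId(new EntryId("1"))   // key leaf 'entry-id'; string base type assumed
                  .setTitle("first task")
                  .setDesc("created via the generated builder")
                  .build();

          // Build the top-level 'task' container holding the entry list.
          return new TaskBuilder()
                  .setEntry(Arrays.asList(entry))
                  .build();
      }
  }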

Generated classes for RPC

Our YANG model contains a saveEntry RPC. The YANG maven plugin generates a Service interface if an RPC definition exists in the module. In this case the TaskService.java interface is generated to provide access to the saveEntry RPC.

The saveEntry RPC method accepts a SaveEntryInput object as input. SaveEntryInput.java has a builder class and, like the TaskBuilder class, it is used to create new input objects.

A service provider can implement TaskService to provide the saveEntry RPC service.
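
A minimal sketch of such an implementation is shown below. It assumes that the generated signature wraps the result in a Future<RpcResult<...>>, that a SaveEntryOutput/SaveEntryOutputBuilder pair is generated for the RPC output added above, and that the yangtools RpcResultBuilder helper is available; the class name TaskProvider matches the provider instantiated in the module code later in this tutorial.

  import java.util.concurrent.Future;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.SaveEntryInput;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.SaveEntryOutput;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.SaveEntryOutputBuilder;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.TaskService;
  import org.opendaylight.yangtools.yang.common.RpcResult;
  import org.opendaylight.yangtools.yang.common.RpcResultBuilder;

  public class TaskProvider implements TaskService {

      // Assumed generated signature for 'rpc saveEntry'.
      @Override
      public Future<RpcResult<SaveEntryOutput>> saveEntry(SaveEntryInput input) {
          // Read the RPC input through the generated getters.
          String entryId = input.getEntryId();

          // ... store the entry, e.g. via the DataBroker ...

          // Build the RPC output defined in the model and report success.
          SaveEntryOutput output = new SaveEntryOutputBuilder()
                  .setAdditionalInfo("saved entry " + entryId)
                  .build();
          return RpcResultBuilder.success(output).buildFuture();
      }
  }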

Generated Notification Classes

Though they are not present in the generated project, for completeness we describe the notification classes here. A FooListener interface is generated by the YANG Maven plugin for a foo-event notification if one is specified in the YANG model. Any plugin class can implement this listener interface to receive foo-event notification events.

Another service provider class can use the FooEventBuilder class to create an object of type FooEvent and emit these events when certain conditions are met.
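
Publishing such a notification might look like the sketch below. FooEvent and FooEventBuilder are the hypothetical generated classes discussed above; the publish() call on NotificationProviderService is the SAL API we assume for emitting notifications.

  import org.opendaylight.controller.sal.binding.api.NotificationProviderService;

  // 'FooEvent' and 'FooEventBuilder' are the hypothetical generated classes for a
  // 'foo-event' notification, as described above; real names come from your model.
  public class FooEventEmitter {

      private final NotificationProviderService notificationService;

      public FooEventEmitter(NotificationProviderService notificationService) {
          this.notificationService = notificationService;
      }

      void emitFooEvent() {
          // Build the immutable notification DTO and publish it; the SAL delivers
          // it to every registered FooListener.
          FooEvent event = new FooEventBuilder().build();
          notificationService.publish(event);
      }
  }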

Provider Plugin

  2.6 Developing provider plugin
   provide the details of provider with the aid of figure
   provide yang file
   put the screenshots

Each MD-SAL plugin also has an associated YANG file which, among other things, specifies the SAL services the plugin needs in order to work. Those services belong to the SAL's Config Subsystem. To recall, there are three basic services provided by MD-SAL to wire plugins into the SAL framework: the RPC Registry, the Notification Service, and the Data Broker.

Data Broker: The Data Broker provides access to the MD-SAL data store, including reading and writing data, as well as receiving notifications of model changes. Your application will be given a reference to the Data Broker on initialization.

RPC Registry: The RPC Registry is used to coordinate RPC calls made by external entities - other applications, users - into your application. You will receive a reference to the RPC Registry on initialization.

Notification Service: The Notification Service is used to generate notifications to listeners who have registered for events related to your application. You will receive a reference to the Notification Service on initialization.
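
As an illustration of the Data Broker's 'listening to changes' role, here is a minimal sketch of registering a data change listener on the sample 'task' container. It assumes the binding DataBroker/DataChangeListener API of this controller release; the listener logic and scope choice are examples only.

  import org.opendaylight.controller.md.sal.binding.api.DataBroker;
  import org.opendaylight.controller.md.sal.binding.api.DataChangeListener;
  import org.opendaylight.controller.md.sal.common.api.data.AsyncDataBroker.DataChangeScope;
  import org.opendaylight.controller.md.sal.common.api.data.AsyncDataChangeEvent;
  import org.opendaylight.controller.md.sal.common.api.data.LogicalDatastoreType;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.Task;
  import org.opendaylight.yangtools.yang.binding.DataObject;
  import org.opendaylight.yangtools.yang.binding.InstanceIdentifier;

  public class TaskChangeListener implements DataChangeListener {

      // Register for changes under the 'task' container in the configuration store.
      public void register(DataBroker broker) {
          broker.registerDataChangeListener(LogicalDatastoreType.CONFIGURATION,
                  InstanceIdentifier.create(Task.class), this, DataChangeScope.SUBTREE);
      }

      @Override
      public void onDataChanged(AsyncDataChangeEvent<InstanceIdentifier<?>, DataObject> change) {
          // React to created/updated/removed data, e.g. push the change to the network.
          System.out.println("created: " + change.getCreatedData().keySet());
      }
  }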

Config Subsystem Service Dependency

In the previous section, we created a YANG model, task.yang, that has an RPC definition. To use that model, we must write an RPC implementation and add notification listeners. Before we write a provider to perform such operations, we must include the required MD-SAL dependencies.

All of MD-SAL's dependencies can be included via the config subsystem. In this sample, we will explain how a provider can get the MD-SAL dependencies from the config subsystem.

Service Dependency Configuration

As there is a YANG file describing our plugin, we can again use the yang-maven-plugin to generate Java code to help with the wiring into the config subsystem. The plugin is configured as follows in the provider project's pom.xml:

 19     <build>
 33             <plugin>
 ..........................
 34                 <groupId>org.opendaylight.yangtools</groupId>
 35                 <artifactId>yang-maven-plugin</artifactId>
 36                 <version>${yangtools.version}</version>
 37                 <executions>
 38                     <execution>
 39                         <id>config</id>  <-------- ?????
 40                         <goals>
 41                             <goal>generate-sources</goal>
 42                         </goals>
 43                         <configuration>
 44                             <codeGenerators>
 45                                 <generator>
 46                                     <codeGeneratorClass>org.opendaylight.controller.config.yangjmxgenerator.plugin.JMXGenerator</codeGeneratorClass>
 47                                     <outputBaseDir>${jmxGeneratorPath}</outputBaseDir>
 48                                     <additionalConfiguration>
 49                                         <namespaceToPackage1>urn:opendaylight:params:xml:ns:yang:controller==org.opendaylight.controller.config.yang</namespaceToPackage1>
 50                                     </additionalConfiguration>
 51                                 </generator>
 52                                 <generator>
 53                                     <codeGeneratorClass>org.opendaylight.yangtools.maven.sal.api.gen.plugin.CodeGeneratorImpl</codeGeneratorClass>
 54                                     <outputBaseDir>${salGeneratorPath}</outputBaseDir>
 55                                 </generator>
 56                             </codeGenerators>
 57                             <inspectDependencies>true</inspectDependencies>
 58                         </configuration>
 59                     </execution>
 60                 </executions>
 61                 <dependencies>
 62                     <dependency>
 63                         <groupId>org.opendaylight.yangtools</groupId>
 64                         <artifactId>maven-sal-api-gen-plugin</artifactId>
 65                         <version>${yangtools.version}</version>
 66                         <type>jar</type>
 67                     </dependency>
 68                     <dependency>
 69                         <groupId>org.opendaylight.controller</groupId>
 70                         <artifactId>yang-jmx-generator-plugin</artifactId>
 71                         <version>0.2.5-SNAPSHOT</version>
 72                     </dependency>
 73                 </dependencies>
 74             </plugin>

<WRAP center round important 90%> ERROR The above configuration runs code generation twice: once to the specified <outputBaseDir>${salGeneratorPath}</outputBaseDir>, under execution id “config”, and once more with the defaults to target/generated-sources. –

> Duplicate class error.

Run mvn help:effective-pom to see the details. If this comes from the md-sal archetype –

> TODO bug report!!!!

How to fix?

remove the “id” element from the “execution”. </WRAP>

Config Subsystem Dependency Configuration

The task-provider-impl.yang file is used by the code generator to create Java classes. This listing shows the declaration of the dependencies on the three Config Subsystem services:

 20     // This is the definition of the service implementation as a module identity.
 21     identity task-provider-impl {
 22             base config:module-type;
 23 
 24             // Specifies the prefix for generated java classes.
 25             config:java-name-prefix TaskProvider;
 26     }
 27 
 28     // Augments the 'configuration' choice node under modules/module.
 29     // We consume the three main services, RPCs, DataStore, and Notifications
 30     augment "/config:modules/config:module/config:configuration" {
 31         case task-provider-impl {
 32             when "/config:modules/config:module/config:type = 'task-provider-impl'";
 33 
 34             container rpc-registry {
 35                 uses config:service-ref {
 36                     refine type {
 37                         mandatory true;
 38                         config:required-identity mdsal:binding-rpc-registry;
 39                     }
 40                 }
 41             }
 42 
 43             container notification-service {
 44                 uses config:service-ref {
 45                     refine type {
 46                         mandatory true;
 47                         config:required-identity mdsal:binding-notification-service;
 48                     }
 49                 }
 50             }
 51 
 52             container data-broker {
 53                 uses config:service-ref {
 54                         refine type {
 55                         mandatory false;
 56                         config:required-identity mdsal:binding-async-data-broker;
 57                     }
 58                 }
 59             }
 60         }
 61     }

When the project is compiled with 'mvn clean install', a set of files is generated under the src/main/yang-gen-config and src/main/yang-gen-sal directories. Do not make any changes to the Java files generated in those directories, but do commit them to your git/svn repository.

There are two more files generated under src/main/java: TaskProviderModule.java and TaskProviderModuleFactory.java. Update these files to wire MD-SAL dependencies.

[dang@dai142 dmm]$ tree provider/src/main/java/org/opendaylight/controller/config/yang/config/task_provider/impl
provider/src/main/java/org/opendaylight/controller/config/yang/config/task_provider/impl
├── TaskProviderModuleFactory.java
└── TaskProviderModule.java

TaskProviderModuleFactory.java

TaskProviderModuleFactory.java is a concrete class, instantiated internally by MD-SAL, that creates an instance of TaskProviderModule. The TaskProviderModule.createInstance() method must be implemented to instantiate and wire the service provider. In this sample, the method contains this code snippet:

 14 public class TaskProviderModule extends org.opendaylight.controller.config.yang.config.task_provider.impl.AbstractTaskProviderModule {
...
30     @Override
 31     public AutoCloseable createInstance() {
 32         final TaskProvider appProvider = new TaskProvider();
 33
            // Need the data broker to read/write the MD-SAL config and operational data stores
 34         DataBroker dataBrokerService = getDataBrokerDependency();
 35         appProvider.setDataService(dataBrokerService);
 36         // Registration with global RPC
 37         RpcProviderRegistry rpcRegistryDependency = getRpcRegistryDependency();
 38         final BindingAwareBroker.RpcRegistration<TaskService> rpcRegistration =
 39                                 rpcRegistryDependency
 40                                     .addRpcImplementation(TaskService.class, appProvider);
 41 
 42         //retrieves the notification service for publishing notifications
 43         NotificationProviderService notificationService = getNotificationServiceDependency();
 44 
 45 
 46         // Wrap the provider as AutoCloseable and close registrations to md-sal at
 47         // close()

The createInstance() method is called by MD-SAL when the plugin is loaded. At line 32, an implementation of the provider plugin, which contains the RPC saveEntry(), is instantiated. At lines 34-43 the plugin is wired into the SAL by obtaining references to the Data Broker and the Notification Service, and by registering its RPC implementation with the SAL's RPC registry.

These dependency objects are now available because we have added the required wiring in the task-provider-impl.yang file.

How Does Karaf Know About the Config Subsystem Dependencies?

Similar to the features.xml file that helps Karaf find the location of features in the Maven repository, the plugins also have XML files listing the Config Subsystem services they require. This file is used by Karaf to initialize a module in the config subsystem and make it available for use in a controller environment. Let's look at the XML configuration file for our provider plugin, 'provider/src/main/resources/configuration/initial/05-provider-config.xml':

 10 <snapshot>
 11     <configuration>
 12         <data xmlns="urn:ietf:params:xml:ns:netconf:base:1.0">
 13             <modules xmlns="urn:opendaylight:params:xml:ns:yang:controller:config">
 14                 <module>
 15                     <type xmlns:prefix="urn:opendaylight:params:xml:ns:yang:controller:config:task-provider:impl">
 16                         prefix:task-provider-impl
 17                     </type>
 18                     <name>task-provider-impl</name>
 19 
 20                     <rpc-registry>
 21                         <type xmlns:binding="urn:opendaylight:params:xml:ns:yang:controller:md:sal:binding">binding:binding-rpc-registry</type>
 22                         <name>binding-rpc-broker</name>
 23                     </rpc-registry>
 24                     <data-broker>
 25                         <type xmlns:binding="urn:opendaylight:params:xml:ns:yang:controller:md:sal:binding">binding:binding-async-data-broker</type>
 26                         <name>binding-data-broker</name>
 27                     </data-broker>
 28                      <notification-service>
 29                         <type xmlns:binding="urn:opendaylight:params:xml:ns:yang:controller:md:sal:binding">
 30                             binding:binding-notification-service
 31                         </type>
 32                         <name>binding-notification-broker</name>
 33                     </notification-service>
 34                 </module>
 35 
 36             </modules>
 37         </data>
 38 
 39     </configuration>
 40 
 41     <required-capabilities>
 42         <capability>urn:opendaylight:params:xml:ns:yang:controller:config:task-provider:impl?module=task-provider-impl&amp;revision=2014-05-23</capability>
 43     </required-capabilities>
 44 
 45 </snapshot>

Maven Plugin

In the pom.xml of the provider, we add the build-helper-maven-plugin to ensure the file is available outside the jar for reading. Once the file is available in the Maven repository, we can include it in features.xml.

Recall that in the feature definition for the plugin, besides the name, version, feature dependency list, and bundle list, the <configfile> tag specifies the location of this configuration file in the Maven repository:

   <configfile finalname="${config.configfile.directory}/05-task-provider-config.xml">mvn:de.dailab.nemo.dmm/dmm-provider/${project.version}/xml/config</configfile>
   
 74             <plugin>
 75               <groupId>org.codehaus.mojo</groupId>
 76               <artifactId>build-helper-maven-plugin</artifactId>
 77               <executions>
 78                 <execution>
 79                   <id>attach-artifacts</id>
 80                   <goals>
 81                     <goal>attach-artifact</goal>
 82                   </goals>
 83                   <phase>package</phase>
 84                   <configuration>
 85                     <artifacts>
 86                       <artifact>
 87                         <file>${project.build.directory}/classes/configuration/initial/05-provider-config.xml</file>
 88                         <type>xml</type>
 89                         <classifier>config</classifier>
 90                       </artifact>
 91                     </artifacts>
 92                   </configuration>
 93                 </execution>
 94               </executions>
 95             </plugin>
 96           </plugins>
 97         </build>

Consumer Plugin

Developing a consumer plugin is very similar to developing a provider plugin. For example, our consumer plugin only needs to call the RPC implemented by the provider, which is registered with the SAL as shown in the previous section.

The YANG File

The consumer plugin only needs to depend on the RPC Registry service from the Config Subsystem. This is declared in its task-consumer-impl.yang as follows:

 34     augment "/config:modules/config:module/config:configuration" {
 35                 case task-consumer-impl {
 36                     when "/config:modules/config:module/config:type = 'task-consumer-impl'";
 37 
 38                     container rpc-registry {
 39                         uses config:service-ref {
 40                             refine type {
 41                                 mandatory true;
 42                                 config:required-identity mdsal:binding-rpc-registry;
 43                             }
 44                         }
 45                     }
 46                 }
 47             }

Generated Code

Now we generate code based on the YANG schema definition of the task-consumer configuration. Add the yang-maven-plugin to the Maven pom.xml and run the mvn clean install command. This generates the module code and provides the entry point for our module.

The generated module file will be named TaskConsumerModule.java.

 30     public AutoCloseable createInstance() {
 31         TaskService service = getRpcRegistryDependency().getRpcService(TaskService.class);
 32 
 33         final TaskConsumerImpl consumerImpl = new TaskConsumerImpl(service);
 34 

This method is called when the plugin is loaded into Karaf. At line 31, an instance (a proxy) of the provider's TaskService implementation is obtained from the RPC Registry. At line 33, the consumer implementation is constructed with the RPC reference passed to it.
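
For completeness, a minimal sketch of how TaskConsumerImpl might use the injected RPC service is shown below. The SaveEntryInputBuilder setter name and the output types are assumptions based on the task.yang model; the actual generated names may differ.

  import java.util.concurrent.Future;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.SaveEntryInput;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.SaveEntryInputBuilder;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.SaveEntryOutput;
  import org.opendaylight.yang.gen.v1.opendaylight.sample.rev140407.TaskService;
  import org.opendaylight.yangtools.yang.common.RpcResult;

  public class TaskConsumerImpl {

      private final TaskService taskService;

      public TaskConsumerImpl(TaskService taskService) {
          this.taskService = taskService;
      }

      // Build the input DTO and invoke the provider's RPC through the SAL proxy.
      public Future<RpcResult<SaveEntryOutput>> addEntry(String entryId) {
          SaveEntryInput input = new SaveEntryInputBuilder()
                  .setEntryId(entryId)   // leaf 'entryId' of type string
                  .build();
          return taskService.saveEntry(input);
      }
  }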

