Computer Lab-X Lab Manual
MISSION
2. Possess knowledge and skills in the field of Computer Science & Engineering
and Information Technology for analyzing, designing and implementing
multifaceted engineering problems of any domain with innovative and
efficient approaches.
b. An ability to define a problem and provide a systematic solution with the help of conducting
experiments, as well as analyzing and interpreting the data.
e. An ability to use the techniques, skills, modern engineering tools, and standard
processes necessary for practice as an IT professional.
g. An ability to analyze the local and global impact of computing on individuals, organizations and
society.
h. An ability to understand professional, ethical, legal, security and social issues and
responsibilities.
m. An ability to apply design and development principles in the construction of software systems
of varying complexity.
Sinhgad College of Engineering, Pune
Compliance
Document Control
Authors
Prerequisites:
Course Objectives:
2. To design applications for accessing smart devices and data generated through sensors
and services.
Course Outcomes:
1. Set up the Android environment and explain the Evolution of cellular networks.
3. Create applications for performing CRUD operations on a SQLite database using Android.
4. Create the smart android applications using the data captured through sensors.
5. Implement the authentication protocols between two mobile devices for providing security.
6. Analyze the data collected through android sensors using any machine learning algorithm.
Guidelines:
This Computer Laboratory-X course has ubiquitous computing as a core subject. The problem
statements should be framed based on the first six assignments mentioned in the syllabus. The
teachers will frame the problem statements with due consideration that students have three hours
to complete them. The practical examination will comprise implementation and related theory. All
assignments are to be performed in Java 9.
Tools Required: Android SDK / Android Studio, SQLite, sensors, Arduino kit.
Laboratory Assignments
Assignment 1
Android development environment. Installing and setting up the environment. Hello world
application. Running the emulator. Inserting debug messages.
Assignment 2
Android UI Design: Design a user interface using pre-built UI components such as structured layout
objects, UI controls, and special interfaces such as dialogs, notifications, and menus. Also, make this
UI attractive using the Android graphics platform, OpenGL.
Assignment 3
Android-database Connectivity: Create a SQLite Database for an Android Application and perform
CRUD (Create, Read, Update and Delete) database operations.
Assignment 4
Sensors for building Smart Applications: Use any sensors on the device to add rich location and
motion capabilities to your app, from GPS or network location to accelerometer, gyroscope,
temperature, barometer, and more.
Assignment 5
Develop a Smart Light System (a light that automatically switches on in the evening and off in the
morning) using an open-source hardware platform like Arduino, a sensor (a light-dependent
resistor), and an actuator (an LED).
Assignment 6
Design and Develop a GUI for FAN regulator that uses Android platform.
Assignment 7
Develop an Android-based FAN regulator using an open-source hardware platform like NodeMCU and
an actuator (a servo motor).
Assignment 8
Android and Machine Learning: Mobile multimodal sensing- Draw inferences over the data coming
from phone’s sensing hardware (e.g. accelerometer, GPS, microphone), and processing these
samples with the help of machine learning. (Any Application: Healthcare, Smart City, Agriculture,
etc).
Assignment 9
Android API: Implement an application that uses Android APIs like Google Map, recording and
playing audio and video, using the built-in camera as an input device.
Assignment 10
Wireless Network: Develop an app for a rolling display program of news on computer display. The
input strings are supplied by the mobile phone/ by another computer connected through wireless
networks.
Assignment 11
Assignment 12
1. https://developer.android.com/
2. https://www.androidhive.info/2011/11/android-sqlite-database-tutorial/
3. https://developers.google.com/android/guides/api-client
4. https://developer.android.com/guide/topics/sensors/sensors_overview
INDEX
Sr. No. | Title | Page No.
1 | Android development environment. Installing and setting up the environment. Hello world application. Running the emulator. Inserting debug messages. | 14
2 | Android UI Design: Design a User Interface using pre-built UI components such as structured layout objects, UI controls and special interfaces such as dialogs, notifications, and menus. Also, make this UI attractive using Android graphics platform OpenGL. | 31
3 | Android-database Connectivity: Create a SQLite Database for an Android Application and perform CRUD (Create, Read, Update and Delete) database operations. | 52
4 | Sensors for building Smart Applications: Use any sensors on the device to add rich location and motion capabilities to your app, from GPS or network location to accelerometer, gyroscope, temperature, barometer, and more. | 69
5 | Develop a Smart Light System (a light that automatically switches on in the evening and off in the morning) using an open-source hardware platform like Arduino, a sensor (light-dependent resistor), and an actuator (an LED). | 70
6 | Design and Develop a GUI for FAN regulator that uses Android platform. | 74
Objective:
Theory:
Android Studio is Google's IDE for Android apps. Android Studio gives you an advanced code
editor and a set of app templates. In addition, it contains tools for development, debugging,
testing, and performance that make it faster and easier to develop apps. You can test your
apps with a large range of preconfigured emulators or on your own mobile device, and build
production APKs for publication.
You may need to install the Java Development Kit - Java 7 or better.
Install Android Studio
Android Studio is available for Windows, Mac, and Linux computers. The installation is similar
for all platforms. Any differences will be noted in the sections below.
1. Navigate to the Android developers site and follow the instructions to download and install
Android Studio.
2. After finishing the install, the Setup Wizard will download and install some additional
components.
3. When the download completes, Android Studio will start, and you are ready to create your
first project.
In this task, you will implement the "Hello World" app to verify that Android Studio is correctly
installed and learn the basics of developing with Android Studio.
o Apps published to the Google Play Store must have a unique package name. Since
domains are unique, prepending your app's name with your or your company's
domain name is going to result in a unique package name.
o If you are not planning to publish your app, you can accept the default example
domain. Be aware that changing the package name of your app later is extra work.
6. Verify that the default Project location is where you want to store your Hello World app and
other Android Studio projects, or change it to your preferred directory. Click Next.
7. On the Target Android Devices screen, "Phone and Tablet" should be selected.
8. Click Next.
9. If your project requires additional components for your chosen target SDK, Android Studio
will install them automatically. Click Next.
10. Customize the Activity window. Every app needs at least one activity. An activity represents
a single screen with a user interface and Android Studio provides templates to help you get
started. For the Hello World project, choose the simplest template (as of this writing, the
"Empty Activity" project template is the simplest template) available.
11. It is a common practice to call your main activity MainActivity. This is not a requirement.
12. Make sure the Generate Layout file box is checked (if visible).
13. Make sure the Backwards Compatibility (App Compat) box is checked.
14. Leave the Layout Name as activity_main. It is customary to name layouts after the activity
they belong to. Accept the defaults and click Finish.
The Android Studio window should look similar to the following diagram:
You can look at the hierarchy of the files for your app in multiple ways.
1. Click on the Hello World folder to expand the hierarchy of files (1),
2. Click on Project (2).
3. Click on the Android menu (3).
4. Explore the different view options for your project.
In this practical, you will explore how the project files are organized in Android Studio.
These steps assume that your Hello World project starts out as shown in the diagram above.
In the Project > Android view of your previous task, there are three top-level folders below
your app folder: manifests, java, and res.
1. Expand the manifests folder.
This folder contains AndroidManifest.xml. This file describes all of the components of your
Android app and is read by the Android run-time system when your program is executed.
2. Expand the java folder. All your Java language files are organized in this folder.
The java folder contains three subfolders:
o com.example.hello.helloworld (or the domain name you have specified): All the files for a
package are in a folder named after the package. For your Hello World application, there is
one package and it only contains MainActivity.java (the file extension may be omitted in the
Project view).
o com.example.hello.helloworld(androidTest): This folder is for your instrumented tests, and
starts out with a skeleton test file.
o com.example.hello.helloworld(test): This folder is for your unit tests and starts out with an
automatically created skeleton unit test file.
3. Expand the res folder. This folder contains all the resources for your app, including images,
layout files, strings, icons, and styling. It includes these subfolders:
o drawable. Store all your app's images in this folder.
o layout. Every activity has at least one layout file that describes the UI in XML. For Hello World,
this folder contains activity_main.xml.
o mipmap. Store your launcher icons in this folder. There is a sub-folder for each supported
screen density. Android uses the screen density, that is, the number of pixels per inch to
determine the required image resolution. Android groups all actual screen densities into
generalized densities, such as medium (mdpi), high (hdpi), or extra-extra-extra-high
(xxxhdpi). The ic_launcher.png folder contains the default launcher icons for all the densities
supported by your app.
o values. Instead of hardcoding values like strings, dimensions, and colors in your XML and Java
files, it is best practice to define them in their respective values file. This makes it easier to
change and be consistent across your app.
4. Expand the values subfolder within the res folder. It includes these subfolders:
o colors.xml. Shows the default colors for your chosen theme, and you can add your own colors
or change them based on your app's requirements.
o dimens.xml. Store the sizes of views and objects for different resolutions.
o strings.xml. Create resources for all your strings. This makes it easy to translate them to other
languages.
o styles.xml. All the styles for your app and theme go here. Styles help give your app a
consistent look for all UI elements.
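As a brief sketch of why these values files matter, resources defined there can be read from Java at runtime instead of being hardcoded. Note that R.dimen.activity_horizontal_margin and R.color.colorAccent below are the usual project-template defaults, not names taken from this manual; adjust them to your project.

```java
// Inside an Activity: read resources defined in res/values at runtime.
// R.string.app_name comes from strings.xml (used by the Hello World template).
String appName = getString(R.string.app_name);

// Dimensions and colors are read through the Resources object.
// The resource names below are template defaults (adjust to your project).
float margin = getResources().getDimension(R.dimen.activity_horizontal_margin);
int accent = getResources().getColor(R.color.colorAccent);
```

This way a string can be translated, or a color changed, in one place without touching the Java code.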
3.2 The Gradle build system
Android Studio uses Gradle as its build system. As you progress through these practicals, you
will learn more about Gradle and what you need to build and run your apps.
1. Expand the Gradle Scripts folder. This folder contains all the files needed by the build system.
2. Look for the build.gradle(Module:app) file. When you are adding app-specific dependencies,
such as using additional libraries, they go into this file.
In this task, you will use the Android Virtual Device (AVD) manager to create a virtual device
or emulator that simulates the configuration for a particular type of Android device.
Using the AVD Manager, you define the hardware characteristics of a device and its API level,
and save it as a virtual device configuration.
When you start the Android emulator, it reads a specified configuration and creates an
emulated device that behaves exactly like a physical version of that device, but it resides on
your computer.
Why: With virtual devices, you can test your apps on different devices (tablets, phones) with
different API levels to make sure it looks good and works for most users. You do not need to
depend on having a physical device available for app development.
In order to run an emulator on your computer, you have to create a configuration that
describes the virtual device.
1. In Android Studio, select Tools > Android > AVD Manager, or click the AVD Manager icon in the toolbar.
The Select Hardware screen appears showing a list of preconfigured hardware devices. For
each device, the table shows its diagonal display size (Size), screen resolution in pixels
(Resolution), and pixel density (Density).
For the Nexus 5 device, the pixel density is xxhdpi, which means your app uses the launcher
icons in the xxhdpi folder of the mipmap folder. Likewise, your app will use layouts and
drawables from folders defined for that density as well.
There are many more versions available than shown in the recommended tab. Look at
the x86 Images and Other Images tabs to see them.
5. If a Download link is visible next to a system image version, it is not installed yet, and you
need to download it. If necessary, click the link to start the download, and click Finish when
it's done.
6. On the System Image screen, choose a system image and click Next.
7. Verify your configuration, and click Finish. (If the Your Android Devices AVD Manager
window stays open, you can go ahead and close it.)
In this task, you will finally run your Hello World app.
1. In Android Studio, select Run > Run app or click the Run icon in the toolbar.
2. In the Select Deployment Target window, under Available Emulators, select Nexus 5 API
23 and click OK.
The emulator starts and boots just like a physical device. Depending on the speed of your
computer, this may take a while. Your app builds, and once the emulator is ready, Android
Studio will upload the app to the emulator and run it.
You should see the Hello World app as shown in the following screenshot.
OUTPUT
Note: When testing on an emulator, it is a good practice to start it up once, at the very
beginning of your session. You should not close the emulator until you are done testing your
app, so that your app doesn't have to go through the boot process again.
In this practical, you will add log statements to your app, which are displayed in the logging
window of the Android Monitor.
Why: Log messages are a powerful debugging tool that you can use to check on values,
execution paths, and report exceptions.
1. Click the Android Monitor button at the bottom of Android Studio to open the Android
Monitor.
By default, this opens to the logcat tab, which displays information about your app as it is
running. If you add log statements to your app, they are printed here as well.
You can also monitor the Memory, CPU, GPU, and Network performance of your app from
the other tabs of the Android Monitor. This can be helpful for debugging and performance
tuning your code.
2. The default log level is Verbose. In the drop-down menu, change the log level to Debug.
Log statements that you add to your app code print a message specified by you in the logcat
tab of the Android Monitor. For example:
1. Open your Hello World app in Android Studio, and open the MainActivity file.
2. Choose File > Settings > Editor > General > Auto Import (Mac: Android Studio > Preferences >
Editor > General > Auto Import). Select all check boxes and set Insert imports on paste to All.
Unambiguous imports are now added automatically to your files. Note that the "add unambiguous
imports on the fly" option is important for some Android features such as NumberFormat. If it is
not checked, NumberFormat shows an error. Click Apply, and then click OK.
3. In the onCreate method, add the following log statement:
4. Log.d("MainActivity", "Hello World");
5. If the Android Monitor is not already open, click the Android Monitor tab at the bottom of
Android Studio to open it. (See screenshot.)
6. Make sure that the Log level in the Android Monitor logcat is set to Debug or Verbose
(default).
7. Run your app.
Solution Code:
package com.example.hello.helloworld;

import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;

public class MainActivity extends AppCompatActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        // Prints "Hello World" at Debug level in the logcat tab.
        Log.d("MainActivity", "Hello World");
    }
}
Every app includes an Android Manifest file (AndroidManifest.xml). The manifest file contains
essential information about your app and presents this information to the Android runtime
system. Android must have this information before it can run any of your app's code.
In this practical you will find and read the AndroidManifest.xml file for the Hello World app.
1. Open your Hello World app in Android studio, and in the manifests folder,
openAndroidManifest.xml.
2. Read the file and consider what each line of code indicates. The code below is annotated to
give you some hints.
Annotated code:
<!-- XML version and character encoding -->
<?xml version="1.0" encoding="utf-8"?>
<!-- Required starting tag for the manifest -->
<manifest
<!-- Defines the android namespace. Do not change. -->
xmlns:android="http://schemas.android.com/apk/res/android"
<!-- Unique package name of your app. Do not change once app is
published. -->
package="com.example.hello.helloworld">
<!-- Required application tag -->
<application
<!-- Allow the application to be backed up and restored. -->
android:allowBackup="true"
<!-- Icon for the application as a whole,
and default icon for application components. -->
android:icon="@mipmap/ic_launcher"
<!-- User-readable label for the application as a whole,
and default label for application components. Notice that Android
Studio first shows the actual label "Hello World".
Click on it, and you will see that the code actually refers to a string
resource. Ctrl-click @string/app_name to see where the resource is
specified. This will be covered in a later practical. -->
android:label="@string/app_name"
<!-- Whether the app is willing to support right-to-left layouts. -->
android:supportsRtl="true"
<!-- Default theme for styling all activities. -->
android:theme="@style/AppTheme">
<!-- Declares an activity. One is required.
All activities must be declared,
otherwise the system cannot see and run them. -->
<activity
<!-- Name of the class that implements the activity;
subclass of Activity. -->
android:name=".MainActivity">
<!-- Specifies the intents that this activity can respond to. -->
<intent-filter>
<!-- The action and category together determine what
happens when the activity is launched. -->
<!-- Start activity as the main entry point.
Does not receive data. -->
<action android:name="android.intent.action.MAIN" />
<!-- Start this activity as a top-level activity in
the launcher. -->
<category android:name="android.intent.category.LAUNCHER" />
<!-- Closing tags -->
</intent-filter>
</activity>
</application>
</manifest>
Android Studio uses a build system called Gradle. Gradle does incremental builds, which allows
for shorter edit-test cycles.
1. In your project hierarchy, find Gradle Scripts and expand it. There are several build.gradle files:
one with directives for your whole project, and one for each app module. The module for
your app is called "app". In the Project view, it is represented by the app folder at the top
level of the Project view.
2. Open build.gradle (Module: app).
3. Read the file and learn what each line of code indicates.
Solution:
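The manual leaves this solution open. As a hedged sketch, a module-level build.gradle from an Android Studio project of this era typically looks like the following; every version number here is a placeholder, not a value taken from the manual:

```groovy
apply plugin: 'com.android.application' // marks this module as an Android app

android {
    compileSdkVersion 23                // API level used to compile the app
    defaultConfig {
        applicationId "com.example.hello.helloworld" // unique package name
        minSdkVersion 15                // oldest supported API level
        targetSdkVersion 23             // API level the app is tested against
        versionCode 1                   // internal version number
        versionName "1.0"               // user-visible version string
    }
}

dependencies {
    // App-specific dependencies, such as the AppCompat support library, go here.
    compile 'com.android.support:appcompat-v7:23.4.0'
}
```

Compare each line against your own generated file; the structure should match even if the versions differ.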
In this final task, you will run your app on a physical mobile device such as a phone or tablet.
Why: Your users will run your app on physical devices. You should always test your apps on
both virtual and physical devices.
To let Android Studio communicate with your device, you must turn on USB Debugging on
your Android device. This is enabled in the Developer options settings of your device. Note that
this is not the same as rooting your device.
On Android 4.2 and higher, the Developer options screen is hidden by default. To show
Developer options and enable USB Debugging:
1. On your device, open Settings > About phone and tap Build number seven times.
2. Return to the previous screen (Settings). Developer options appears at the bottom of the list.
Click Developer options.
3. Choose USB Debugging.
Now you can connect your device and run the app from Android Studio.
Android Studio should install and run the app on your device.
Conclusion:
Thus, we have learned how to install and use the Android IDE, understood the development
process for building Android apps, and created an Android project from a basic app template.
FAQs:-
Objective:
Theory:
Your app's user interface is everything that the user can see and interact with. Android
provides a variety of pre-built UI components such as structured layout objects and UI
controls that allow you to build the graphical user interface for your app. Android also
provides other UI modules for special interfaces such as dialogs, notifications, and menus.
Layouts
A layout defines the structure for a user interface in your app, such as in an activity. All
elements in the layout are built using a hierarchy of View and ViewGroup objects. A View
usually draws something the user can see and interact with. Whereas a ViewGroup is an
invisible container that defines the layout structure for View and other ViewGroup objects, as
shown in figure 1.
The View objects are usually called "widgets" and can be one of many subclasses, such
as Button or TextView. The ViewGroup objects are usually called "layouts" and can be one of many
types that provide a different layout structure, such as LinearLayout or ConstraintLayout.
When you compile your app, each XML layout file is compiled into a View resource. You should
load the layout resource from your app code, in your Activity.onCreate() callback
implementation. Do so by calling setContentView(), passing it the reference to your layout
resource in the form of: R.layout.layout_file_name. For example, if your XML layout is saved
as main_layout.xml, you would load it for your Activity like so:
JAVA
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main_layout);
}
The onCreate() callback method in your Activity is called by the Android framework when your
Activity is launched (see the discussion about lifecycles, in the Activities document).
Attributes
Every View and ViewGroup object supports their own variety of XML attributes. Some
attributes are specific to a View object (for example, TextView supports the textSize attribute),
but these attributes are also inherited by any View objects that may extend this class. Some
are common to all View objects, because they are inherited from the root View class (like the
id attribute). And, other attributes are considered "layout parameters," which are attributes
that describe certain layout orientations of the View object, as defined by that object's parent
ViewGroup object.
ID
Any View object may have an integer ID associated with it, to uniquely identify the View within
the tree. When the app is compiled, this ID is referenced as an integer, but the ID is typically
assigned in the layout XML file as a string, in the id attribute. This is an XML attribute common
to all View objects (defined by the View class) and you will use it very often. The syntax for an
ID, inside an XML tag is:
android:id="@+id/my_button"
The at-symbol (@) at the beginning of the string indicates that the XML parser should parse
and expand the rest of the ID string and identify it as an ID resource. The plus-symbol (+)
means that this is a new resource name that must be created and added to our resources (in
the R.java file). There are a number of other ID resources that are offered by the Android
framework. When referencing an Android resource ID, you do not need the plus-symbol,
but must add the android package namespace, like so:
android:id="@android:id/empty"
With the android package namespace in place, we're now referencing an ID from
the android.R resources class, rather than the local resources class.
In order to create views and reference them from the app, a common pattern is to:
1. Define a view/widget in the layout file and assign it a unique ID:
<Button android:id="@+id/my_button"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/my_button_text"/>
2. Then create an instance of the view object and capture it from the layout (typically in
the onCreate() method):
JAVA
Button myButton = (Button) findViewById(R.id.my_button);
Defining IDs for view objects is important when creating a RelativeLayout. In a relative layout,
sibling views can define their layout relative to another sibling view, which is referenced by the
unique ID.
An ID need not be unique throughout the entire tree, but it should be unique within the part
of the tree you are searching (which may often be the entire tree, so it's best to be completely
unique when possible).
Layout Parameters
XML layout attributes named layout_something define layout parameters for the View that are
appropriate for the ViewGroup in which it resides.
Every ViewGroup class implements a nested class that extends ViewGroup.LayoutParams. This
subclass contains property types that define the size and position for each child view, as
appropriate for the view group. As you can see in figure 2, the parent view group defines
layout parameters for each child view (including the child view group).
Figure 2. Visualization of a view hierarchy with layout parameters associated with each view
Note that every LayoutParams subclass has its own syntax for setting values. Each child
element must define LayoutParams that are appropriate for its parent, though it may also
define different LayoutParams for its own children.
All view groups include a width and height ( layout_width and layout_height), and each view is
required to define them. Many LayoutParams also include optional margins and borders.
You can specify width and height with exact measurements, though you probably won't want
to do this often. More often, you will use one of these constants to set the width or height:
wrap_content tells your view to size itself to the dimensions required by its content.
match_parent tells your view to become as big as its parent view group will allow.
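As a small sketch of these constants in use (the TextView and its string resource are illustrative, not part of the Hello World project):

```xml
<!-- A vertical LinearLayout that fills its parent; the TextView inside
     sizes itself to its content. -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">

    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="@string/app_name" />
</LinearLayout>
```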
In general, specifying a layout width and height using absolute units such as pixels is not
recommended. Instead, using relative measurements such as density-independent pixel units
(dp), wrap_content, or match_parent, is a better approach, because it helps ensure that your
app will display properly across a variety of device screen sizes. The accepted measurement
types are defined in the Available Resources document.
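The dp unit works because Android converts it to physical pixels with the documented formula px = dp * (dpi / 160), where 160 dpi is the mdpi baseline density. The class below is a plain-Java sketch of that conversion for study purposes; in an app you would use the framework's TypedValue.applyDimension() instead:

```java
// DensityDemo: plain-Java sketch of Android's dp-to-px conversion,
// px = dp * (dpi / 160), where 160 dpi is the mdpi baseline density.
public class DensityDemo {

    static int dpToPx(float dp, float dpi) {
        // Round to the nearest whole pixel.
        return Math.round(dp * (dpi / 160f));
    }

    public static void main(String[] args) {
        // On an mdpi (160 dpi) screen, 1 dp equals 1 px.
        System.out.println(dpToPx(48f, 160f)); // 48
        // On an xxhdpi (480 dpi) screen, 1 dp equals 3 px.
        System.out.println(dpToPx(48f, 480f)); // 144
    }
}
```

This is why a 48dp button looks the same physical size on low- and high-density screens, while a 48px button would shrink as density grows.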
Layout Position
The geometry of a view is that of a rectangle. A view has a location, expressed as a pair
of left and top coordinates, and two dimensions, expressed as a width and a height. The unit for
location and dimensions is the pixel.
It is possible to retrieve the location of a view by invoking the methods getLeft() and getTop().
The former returns the left, or X, coordinate of the rectangle representing the view. The latter
returns the top, or Y, coordinate of the rectangle representing the view. These methods both
return the location of the view relative to its parent. For instance, when getLeft() returns 20,
that means the view is located 20 pixels to the right of the left edge of its direct parent.
The size of a view is expressed with a width and a height. A view actually possesses two pairs
of width and height values.
The first pair is known as measured width and measured height. These dimensions define how
big a view wants to be within its parent. The measured dimensions can be obtained by
calling getMeasuredWidth() and getMeasuredHeight().
The second pair is simply known as width and height, or sometimes drawing width and
drawing height. These dimensions define the actual size of the view on screen, at drawing time
and after layout. These values may, but do not have to, be different from the measured width
and height. The width and height can be obtained by calling getWidth() and getHeight().
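Because these values are only meaningful after layout has happened, querying them in onCreate() returns 0. A common sketch (reusing the myButton reference captured earlier; the tag and message are illustrative, not from the manual) is to post a Runnable to the view:

```java
// Position and size are only valid after layout, so query them in a
// Runnable that the view runs once it has been laid out.
myButton.post(new Runnable() {
    @Override
    public void run() {
        int left = myButton.getLeft();     // X coordinate relative to parent, in px
        int top = myButton.getTop();       // Y coordinate relative to parent, in px
        int width = myButton.getWidth();   // drawing width after layout
        int height = myButton.getHeight(); // drawing height after layout
        Log.d("Geometry", "pos=(" + left + "," + top + "), size=" + width + "x" + height);
    }
});
```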
To measure its dimensions, a view takes into account its padding. The padding is expressed in
pixels for the left, top, right and bottom parts of the view. Padding can be used to offset the
content of the view by a specific number of pixels. For instance, a left padding of 2 will push
the view's content by 2 pixels to the right of the left edge. Padding can be set using
the setPadding(int, int, int, int) method and queried by
calling getPaddingLeft(), getPaddingTop(), getPaddingRight() and getPaddingBottom().
Even though a view can define a padding, it does not provide any support for margins.
However, view groups provide such support. Refer
to ViewGroup and ViewGroup.MarginLayoutParams for further information.
For more information about dimensions, see Dimension Values.
Common Layouts
Each subclass of the ViewGroup class provides a unique way to display the views you nest
within it. Below are some of the more common layout types that are built into the Android
platform.
Note: Although you can nest one or more layouts within another layout to achieve your UI
design, you should strive to keep your layout hierarchy as shallow as possible. Your layout
draws faster if it has fewer nested layouts (a wide view hierarchy is better than a deep view
hierarchy).
Linear Layout
A layout that organizes its children into a single horizontal or vertical row. It creates a scrollbar
if the length of the window exceeds the length of the screen.
Relative Layout
Enables you to specify the location of child objects relative to each other (child A to the left of
child B) or to the parent (aligned to the top of the parent).
Web View
Displays web pages.
You can use the built-in views to design your activity as per your requirements: just drag and
drop the views and set their attributes. Refer to the image to design an activity.
Dialogs
A dialog is a small window that prompts the user to make a decision or enter additional
information. A dialog does not fill the screen and is normally used for modal events that
require users to take an action before they can proceed.
Dialogs inform users about a task and can contain critical information, require decisions, or
involve multiple tasks.
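As a minimal sketch (inside an Activity; the title, message, and button labels are illustrative, not from the manual), a dialog is usually built with AlertDialog.Builder:

```java
// Build and show a simple confirmation dialog.
new AlertDialog.Builder(this)
        .setTitle("Delete entry")
        .setMessage("Are you sure you want to delete this entry?")
        .setPositiveButton("Delete", new DialogInterface.OnClickListener() {
            @Override
            public void onClick(DialogInterface dialog, int which) {
                // Perform the action the user confirmed here.
            }
        })
        .setNegativeButton("Cancel", null) // a null listener just dismisses
        .show();
```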
Menus
Menus are a common user interface component in many types of applications. To provide a
familiar and consistent user experience, you should use the Menu APIs to present user actions
and other options in your activities.
Here, we are inflating the menu by calling the inflate() method of MenuInflater class. To
perform event handling on menu items, you need to override onOptionsItemSelected()
method of Activity class.
There are 3 types of menus in Android:
1. Options Menu: The options menu is the primary collection of menu items for an activity.
It's where you should place actions that have an overall impact on the app, such as
Search, Compose Email, and Settings.
2. Context Menu: A context menu is a floating menu that appears when the user performs
a long-click on an element. It provides actions that affect the selected content or
context frame.
3. Popup Menu: A popup menu displays a list of items in a vertical list that is
anchored (stuck) to the view that invoked the menu. It's good for providing an
overflow of actions that relate to specific content or to provide options for a second
part of a command.
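The inflate-and-handle pattern described above can be sketched as follows inside an Activity (R.menu.menu_file and R.id.i1 correspond to the menu_file.xml resource shown later in this section; the Toast message is illustrative):

```java
// Inflate the options menu from res/menu/menu_file.xml.
@Override
public boolean onCreateOptionsMenu(Menu menu) {
    getMenuInflater().inflate(R.menu.menu_file, menu);
    return true; // return true so the menu is displayed
}

// Handle clicks on individual menu items.
@Override
public boolean onOptionsItemSelected(MenuItem item) {
    if (item.getItemId() == R.id.i1) {
        Toast.makeText(this, "item selected", Toast.LENGTH_SHORT).show();
        return true; // the event has been consumed
    }
    return super.onOptionsItemSelected(item);
}
```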
How to create a Menu?
For all menu types mentioned above, Android provides a standard XML format to define menu
items. Instead of building a menu in your activity's code, you should define a menu and all its
items in an XML menu resource. You can then inflate the menu resource, i.e., load the XML file
as a Menu object, in your activity.
<menu>
It defines a Menu, which is a container for menu items. A <menu> element must be the
root node for the file and can hold one or more <item> and <group> elements.
<item>
It creates a MenuItem, which represents a single item in a menu. This element may contain
a nested <menu> element in order to create a submenu.
<group>
It is an optional, invisible container for <item> elements. It allows you to categorize menu
items so they share properties such as active state and visibility.
menu_file.xml
<?xml version="1.0" encoding="utf-8"?>
<menu xmlns:android="http://schemas.android.com/apk/res/android">
    <item android:id="@+id/i1"
        android:title="item">
        <!-- a nested <menu> creates a submenu; ids i2 and i3 are referenced from the Java code below -->
        <menu>
            <item android:id="@+id/i2" android:title="sub-menu 1" />
            <item android:id="@+id/i3" android:title="sub-menu 2" />
        </menu>
    </item>
</menu>
The <item> element supports several attributes you can use to define an item's appearance and
behavior. The items in the above menu include the following attributes:
android:id
A resource ID that's unique to the item, which allows the application to recognize the item
when the user selects it.
android:icon
A reference to a drawable to use as the item's icon.
android:title
The menu title as a string resource or raw string.
activity_main2.xml
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">
<TextView
android:id="@+id/textView"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginTop="51dp"
android:textSize="35sp"
android:text="This is new activity"
android:layout_alignParentTop="true"
android:layout_alignParentStart="true" />
<TextView
android:id="@+id/t1"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="I am context menu"
android:paddingBottom="30dp"
android:textAllCaps="true"
android:textSize="20sp"
android:layout_marginTop="11dp"
android:layout_below="@+id/textView"
android:layout_centerHorizontal="true" />
<Button
android:id="@+id/button"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:onClick="pop"
android:text="I am Pop Menu"
android:textAllCaps="true"
android:layout_below="@+id/textView"
android:layout_centerHorizontal="true"
android:layout_marginTop="68dp" />
</RelativeLayout>
Main2Activity.java
package comjdjaydeeppatil.trialscoe;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.support.v7.widget.PopupMenu;
import android.view.ContextMenu;
import android.view.Menu;
import android.view.MenuInflater;
import android.view.MenuItem;
import android.view.View;
import android.widget.TextView;
import android.widget.Toast;
public class Main2Activity extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main2);
registerForContextMenu((TextView) findViewById(R.id.t1));
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
MenuInflater mi = getMenuInflater();
mi.inflate(R.menu.menu_file,menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
switch (item.getItemId()){
case R.id.i1:
Toast.makeText(this,"Clicked Main Menu",Toast.LENGTH_SHORT).show();
break;
case R.id.i2:
Toast.makeText(this,"I am sub-menu 1",Toast.LENGTH_SHORT).show();
break;
case R.id.i3:
Toast.makeText(this,"I am sub-menu 2",Toast.LENGTH_SHORT).show();
break;
}
return true;
}
@Override
public void onCreateContextMenu(ContextMenu menu, View v,
ContextMenu.ContextMenuInfo menuInfo) {
super.onCreateContextMenu(menu, v, menuInfo);
MenuInflater mi = getMenuInflater();
mi.inflate(R.menu.menu_file,menu);
}
@Override
public boolean onContextItemSelected(MenuItem item) {
switch (item.getItemId()){
case R.id.i1:
Toast.makeText(this,"Clicked Main menu",Toast.LENGTH_SHORT).show();
break;
case R.id.i2:
Toast.makeText(this,"I am sub-menu 1",Toast.LENGTH_SHORT).show();
break;
case R.id.i3:
Toast.makeText(this,"I am sub-menu 2",Toast.LENGTH_SHORT).show();
break;
}
return true;
}
}
Conclusion:
Thus we studied how to make a simple UI design using built-in views. We also studied Menus and
Dialogs to make the app more attractive.
FAQs:-
1. What’s the difference between an implicit and an explicit intent?
2. When should you use a Fragment, rather than an Activity?
3. You’re replacing one Fragment with another — how do you ensure that the user
can return to the previous Fragment, by pressing the Back button?
4. How would you create a multi-threaded Android app without using the Thread
class?
5. What is a ThreadPool? And is it more effective than using several separate Threads?
6. What is the relationship between the lifecycle of an AsyncTask and the lifecycle of
an Activity? What problems can this result in, and how can these problems be
avoided?
Assignment No.: 3
Back to Index
Lab. Assignment No – 3
Aim: Android-database Connectivity: Create a SQLite Database for an Android Application and
perform CRUD (Create, Read, Update and Delete) database operations.
Theory:
What is SQLite?
SQLite is an SQL Database. So in SQL database, we store data in tables. The tables are the
structure of storing data consisting of rows and columns.
What is CRUD?
As the heading tells you here, we are going to learn the CRUD operation in SQLite Database.
But what is CRUD? CRUD is nothing but an abbreviation for the basic operations that we
perform in any database. And the operations are
Create
Read
Update
Delete
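The four operations map directly onto SQL statements. As a quick, self-contained illustration (using Python's built-in sqlite3 module rather than the Android API, which is covered below; table and column names here are examples):

```python
import sqlite3

# In-memory database so the sketch is self-contained; a real app would use a file.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE student (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT, marks INTEGER)")

# Create: insert a row
cur.execute("INSERT INTO student (name, marks) VALUES (?, ?)", ("Asha", 81))

# Read: query the row back
row = cur.execute("SELECT name, marks FROM student WHERE id = 1").fetchone()

# Update: change a column value
cur.execute("UPDATE student SET marks = ? WHERE id = ?", (95, 1))

# Delete: remove the row, then count what is left
cur.execute("DELETE FROM student WHERE id = ?", (1,))
count = cur.execute("SELECT COUNT(*) FROM student").fetchone()[0]
conn.commit()
conn.close()
```

The same four statements appear again in the Android listings below, wrapped in ContentValues and the SQLiteDatabase helper methods.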
Android SQLite
Android SQLite is a very lightweight database which comes with Android OS. Android SQLite
combines a clean SQL interface with a very small memory footprint and decent speed. For
Android, SQLite is “baked into” the Android runtime, so every Android application can create
its own SQLite databases.
The Android SQLite native API is not JDBC, as JDBC might be too much overhead for a memory-limited
smartphone. Once a database is created successfully, it is located
in /data/data/<package_name>/databases/, accessible from the Android Device Monitor.
SQLite is a typical relational database, containing tables (which consist of rows and columns),
indexes etc. We can create our own tables to hold the data accordingly. This structure is
referred to as a schema.
Android SQLite SQLiteOpenHelper
Android has features available to handle changing database schemas, which mostly depend on
using the SQLiteOpenHelper class.
1. When the application runs the first time – At this point, we do not yet have a database.
So we will have to create the tables, indexes, starter data, and so on.
2. When the application is upgraded to a newer schema – Our database will still be on the
old schema from the older edition of the app. We will have option to alter the database
schema to match the needs of the rest of the app.
SQLiteOpenHelper wraps up this logic to create and upgrade a database as per our
specifications. For that we’ll need to create a custom subclass
of SQLiteOpenHelper implementing at least the following three methods.
1. Constructor: This takes the Context (e.g., an Activity), the name of the database, an
optional cursor factory (we’ll discuss this later), and an integer representing the version
of the database schema you are using (typically starting from 1 and incremented later).
2. onCreate(SQLiteDatabase db): It’s called when there is no database and the app needs
one. It passes us a SQLiteDatabase object, pointing to a newly created database, that we
can populate with tables and initial data.
3. onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion): It’s called when the
schema version we need does not match the schema version of the database. It passes
us a SQLiteDatabase object and the old and new version numbers, so we can figure
out the best way to convert the database from the old schema to the new one.
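Under the hood, SQLite stores the schema version in the database file (readable as PRAGMA user_version), and SQLiteOpenHelper compares it with the version you pass to the constructor to decide whether onCreate or onUpgrade should run. A minimal sketch of that comparison, in Python's sqlite3 for illustration only (this mimics the decision logic, it is not the real Android API):

```python
import sqlite3

def open_with_version(conn, wanted_version, on_create, on_upgrade):
    """Mimic SQLiteOpenHelper's version check (illustrative sketch)."""
    current = conn.execute("PRAGMA user_version").fetchone()[0]
    if current == 0:                    # fresh database: build the schema
        on_create(conn)
    elif current < wanted_version:      # older schema: migrate it
        on_upgrade(conn, current, wanted_version)
    conn.execute(f"PRAGMA user_version = {wanted_version}")

calls = []
conn = sqlite3.connect(":memory:")
# First open at version 1 triggers "create"; a later open at version 2 triggers "upgrade".
open_with_version(conn, 1, lambda c: calls.append("create"),
                  lambda c, old, new: calls.append(("upgrade", old, new)))
open_with_version(conn, 2, lambda c: calls.append("create"),
                  lambda c, old, new: calls.append(("upgrade", old, new)))
```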
We define a DBManager class to perform all database CRUD(Create, Read, Update and Delete)
operations.
Before performing any database operations like insert, update, delete records in a table, first
open the database connection by calling getWritableDatabase() method as shown below:
public DBManager open() throws SQLException {
    database = dbHelper.getWritableDatabase();
    return this;
}

public void close() {
    dbHelper.close();
}
The following code snippet shows how to insert a new record in the android SQLite database.
ContentValues contentValue = new ContentValues();
contentValue.put(DatabaseHelper.SUBJECT, name);
contentValue.put(DatabaseHelper.DESC, desc);
ContentValues creates an empty set of values (optionally using a given initial size). We’ll discuss
the other instance values when we jump into the coding part.
Updating Record in Android SQLite database table
ContentValues contentValues = new ContentValues();
contentValues.put(DatabaseHelper.SUBJECT, name);
contentValues.put(DatabaseHelper.DESC, desc);
// update the matching row; the where-clause is elided in the original listing
int i = database.update(DatabaseHelper.TABLE_NAME, contentValues, "_id = " + id, null);
return i;
A Cursor represents the entire result set of the query. Once the query is fetched a call
to cursor.moveToFirst()is made. Calling moveToFirst() does two things:
It allows us to test whether the query returned an empty set (by testing the return value)
It moves the cursor to the first result (when the set is not empty)
if (cursor != null) {
    cursor.moveToFirst();
}
return cursor;
Implementation:
Step 2 – Add components in the main activity as shown in the picture below.
activity_main.xml
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">
    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Name"
        android:id="@+id/textView"
        android:layout_alignParentTop="true"
        android:layout_marginTop="44dp" />
    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="SurName"
        android:id="@+id/textView2"
        android:layout_below="@+id/textView"
        android:layout_marginTop="44dp" />
    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Marks"
        android:id="@+id/textView3"
        android:layout_below="@+id/textView2"
        android:layout_marginTop="44dp" />
    <Button
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="View All"
        android:id="@+id/button2"
        android:layout_marginTop="46dp"
        android:layout_below="@+id/button"
        android:layout_alignStart="@+id/button" />
    <Button
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Update"
        android:id="@+id/button_update"
        android:layout_below="@+id/button"
        android:layout_alignStart="@+id/button" />
    <Button
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Add"
        android:id="@+id/button"
        android:layout_marginTop="13dp"
        android:layout_below="@+id/textView3"
        android:layout_centerHorizontal="true" />
    <EditText
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:id="@+id/editText2"
        android:layout_alignBaseline="@+id/textView2"
        android:layout_alignBottom="@+id/textView2"
        android:layout_toRightOf="@+id/textView2"
        android:layout_toEndOf="@+id/textView2"
        android:layout_marginLeft="18dp"
        android:layout_marginStart="18dp" />
    <EditText
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:id="@+id/editText3"
        android:layout_alignBaseline="@+id/textView3"
        android:layout_alignBottom="@+id/textView3"
        android:layout_alignParentRight="true"
        android:layout_alignParentEnd="true"
        android:layout_alignLeft="@+id/editText"
        android:layout_alignStart="@+id/editText" />
    <EditText
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:id="@+id/editTextId"
        android:layout_alignParentTop="true"
        android:layout_toRightOf="@+id/textView2"
        android:layout_toEndOf="@+id/textView2" />
    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="ID"
        android:id="@+id/textView4"
        android:layout_alignBaseline="@+id/editTextId"
        android:layout_alignBottom="@+id/editTextId"
        android:layout_alignRight="@+id/textView"
        android:layout_alignEnd="@+id/textView" />
    <EditText
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:id="@+id/editText"
        android:layout_below="@+id/textView4"
        android:layout_alignLeft="@+id/editTextId"
        android:layout_alignStart="@+id/editTextId" />
</RelativeLayout>
Step 3 – Now create two new Java classes called Student.java and MyHelper.java.
You can see the project structure in the image above to know where to add these Java classes.
Student.java
package com.myapplication;
/**
* Created by jd on 22-Jan-19.
*/
MyHelper.java
package com.myapplication;
/**
* Created by jaydeep on 27-Sep-17.
*/
import android.content.ContentValues;
import android.content.Context;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

public class MyHelper extends SQLiteOpenHelper {
    public static final String DATABASE_NAME = "Student.db"; // DB Name
    public static final String TABLE_NAME = "student_table"; // Table Name
    public static final String COL_1 = "ID";      // Column 1
    public static final String COl_2 = "Name";    // Column 2
    public static final String COL_3 = "SurName"; // Column 3
    public static final String COL_4 = "Marks";   // Column 4

    // Constructor (required; passes the DB name and schema version to the superclass)
    public MyHelper(Context context) {
        super(context, DATABASE_NAME, null, 1);
    }
@Override
public void onCreate(SQLiteDatabase db)
{
db.execSQL("Create Table " + TABLE_NAME + " (ID INTEGER PRIMARY KEY AUTOINCREMENT, NAME TEXT, SURNAME TEXT, MARKS INTEGER)");
}
@Override
public void onUpgrade(SQLiteDatabase sqLiteDatabase, int i, int i1)
{
}
public boolean insertData(String name, String surname, String marks)
{
    SQLiteDatabase db = this.getWritableDatabase(); // it will create the DB & table.
    ContentValues contentValues = new ContentValues(); // used to put the values in the columns
    contentValues.put(COl_2, name);
    contentValues.put(COL_3, surname);
    contentValues.put(COL_4, marks);
    long result = db.insert(TABLE_NAME, null, contentValues);
    if (result == -1)
        return false;
    else
        return true;
}
}
MainActivity.java
package com.myapplication;
import android.database.Cursor;
import android.os.Bundle;
import android.support.v7.app.AlertDialog;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.widget.Button;
import android.widget.EditText;
import android.widget.Toast;
addData();
viewAll();
updateData();
btnviewUpdate.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        boolean isUpdate = mh.updateData(editTextId.getText().toString(),
                editName.getText().toString(),
                editSurname.getText().toString(),
                editMarks.getText().toString());
        if (isUpdate)
            Toast.makeText(MainActivity.this, "Data Updated", Toast.LENGTH_LONG).show();
        else
            Toast.makeText(MainActivity.this, "Data Not Updated", Toast.LENGTH_LONG).show();
    }
});
}
@Override
public void onClick(View view) {
    boolean isInserted = mh.insertData(editName.getText().toString(),
            editSurname.getText().toString(),
            editMarks.getText().toString());
    if (isInserted)
        Toast.makeText(MainActivity.this, "Data Inserted", Toast.LENGTH_LONG).show();
    else
        Toast.makeText(MainActivity.this, "Data Not Inserted", Toast.LENGTH_LONG).show();
}
});
Conclusion:
Thus we implemented an SQLite application to add, view, and update records.
FAQs:-
Theory:
Overview:
Have you ever had a situation where you visited a place and had some task to
do the next time you visit it? While travelling by bus or train, have you ever needed an ATM
or a hospital based on your location?
The application “Advanced GPS location finder to identify hospital location and ATM
location” solves all these problems. It offers the services below:
Retrieves the user’s current geographical coordinates.
Once the user is near the location, the nearest places can be searched and viewed.
The user can edit/delete/update/enable/disable the nearest places.
The user can see the locations on the Map to find out how far he is from the expected location.
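To show how far the user is from the expected location, a common approach is the haversine formula on the two pairs of latitude/longitude coordinates (on Android, Location.distanceTo performs a similar great-circle computation). A plain-Python sketch, with illustrative example coordinates:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (illustrative)."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Two nearby points in Pune (coordinates are approximate examples, ~2 km apart)
d = haversine_m(18.5286, 73.8743, 18.5195, 73.8553)
```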
MODULE DESCRIPTION
Google Map & Searching Location
Google Places search
User Interface
Database
2. Google Places Search: Google provides worldwide search options for Google Places. A
Google search keyword can be used to find any Google Maps location in the world. The
search option is implemented by adding Google Places JARs such as:
google-api-client-1.10.3-beta.jar
google-api-client-android2-1.10.3-beta.jar
google-http-client-1.10.3-beta.jar
google-oauth-client-1.10.1-beta.jar
google-http-client-android2-1.10.3-beta.jar
gson-2.1.jar
guava-11.0.1.jar
jackson-core-asl-1.9.4.jar
jsr305-1.3.9.jar
protobuf-java-2.2.0.jar
https://maps.googleapis.com/maps/api/place/search/json?
These all are used to search the Google Places in Google Map.
3. User Interface: The user interface is an important part of Android applications. In this
application, the user interface is designed using Android XML. Many user-interface designs
are used in this application to make it interactive and easy to handle.
4. The SQLite Database: The SQLite database is one of the main parts of this application,
because the searched location details are stored using the SQLite database on
Android. For future reference, the location details are stored in the SQLite database.
SOFTWARE SPECIFICATIONS
Eclipse IDE for Java Developers - Eclipse 3.6.2 (Helios) or greater
Eclipse JDT plug-in (included in most Eclipse IDE packages)
JDK 6 (JRE alone is not sufficient)
Android Development Tools plug-in (recommended)
OPERATING SYSTEMS
Windows XP (32-bit)
Vista (32- or 64-bit)
Windows 7 (32- or 64-bit)
HARDWARE SPECIFICATIONS
Hard disk - 40 GB
Processor - Pentium IV 2.4 GHz
Ram - 1 GB
Conclusion:
Thus, we have studied and implemented sensors for building smart applications, using the
sensors on the device to add rich location and motion capabilities to the app, from GPS or
network location.
FAQ’s
Back to Index
Lab. Assignment No – 5
AIM :
Design a smart light system which automatically switches an LED on in the evening and off in
the morning, using an Arduino, an LED and an LDR interface.
OBJECTIVES:
To design a smart light system which automatically switches an LED on in the evening and off
in the morning, by programmable control based on the dark resistance and bright resistance
of an LDR (light dependent resistor), using the Arduino IDE and a UNO board.
THEORY :
An LDR is a component that has a (variable) resistance that changes with the light intensity
that falls upon it. See fig 1 and Fig 2 for LDR view and symbol. This allows them to be used in
light sensing circuits.
Fig 1: A typical LDR Fig 2 : LDR Circuit Symbol
The most common type of LDR has a resistance that falls with an increase in the light intensity
falling upon the device (as shown in the image above). The resistance of an LDR varies over a
wide range; for example, one can observe the following resistances (these also depend on the
size of the LDR and may vary in your case):
Daylight = 5,000 Ω
Dark = 20,000,000 Ω
You can therefore see that there is a large variation between these figures. If you plotted this
variation on a graph, you would get something similar to the graph shown above.
Light dependent resistor (LDR) values differ between dark and bright light; Fig 4 shows the
same thing as a graph. This can be observed simply by measuring the LDR resistance on a
multimeter: expose the LDR to bright light and record the value, then hold it in the dark and
record again. These two readings decide the dark and bright thresholds. So if we want to turn
the LED on in the evening, the port where the LED is connected needs to be set ON whenever
the analog port connected to the LDR reads the dark-resistance value or greater. In the else
branch, when the reading corresponds to the bright-resistance value, the LED port is set OFF.
One can also observe the values on the serial port with the appropriate statements. The
connections are shown in fig 5.
1. First connect LDR to any analog port (out of any six) of Arduino UNO board.
2. Connect LED to any digital pin (out of any eleven) of Arduino UNO board.
3. Observe values on serial port
4. Write statements in the Arduino IDE specifying the dark and bright resistance values to turn
the LED ON/OFF.
5. Hold the LDR in your palm and cover it with your fingers; the LED should glow. Now expose
the LDR to light; the LED should turn off automatically.
6. Using the room light as the bright condition, turning a bulb or tube light ON and OFF shows
the same automatic LED ON/OFF effect.
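The decision in steps 4 and 5 reduces to comparing the analog reading against a threshold. The sketch below models that logic in Python; the threshold value 700 and the 0–1023 analog range are illustrative assumptions, and in the actual assignment this comparison is written as an Arduino sketch using analogRead and digitalWrite:

```python
DARK_THRESHOLD = 700  # illustrative: an analog reading at/above this means "dark"

def led_should_be_on(analog_reading, dark_threshold=DARK_THRESHOLD):
    """Return True (LED ON) when the LDR reading indicates darkness."""
    return analog_reading >= dark_threshold

# Simulated readings: bright room, dusk, LDR covered by the palm
states = [led_should_be_on(r) for r in (150, 650, 900)]
```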
CONCLUSION : In this way we can use open-source platforms like the Arduino Uno and its IDE
to build a smart lighting system based on the properties of an LDR sensor, which controls the
LED automatically by turning it ON in the dark and OFF in bright light.
FAQ :
1. What is LDR?
2. How does an LDR function?
3. What is NTC and PTC and what is its significance in real world? Is it possible to manage this
from software to do vice versa operation of LDR?
4. Is it possible to control different 2 different LEDs connected to different digital pins of Arduino
on the basis of different dark and bright conditions of LDR? If ‘yes’ how? If ‘no’ why?
Assignment No.: 06
Back to Index
Lab. Assignment No – 6
AIM :
Design an Android-based fan regulator which provides dynamic fan control using a
NodeMCU ESP8266 (wireless transceiver).
OBJECTIVES :
To design an Android-based fan regulator which controls the fan automatically using an
Android app (Blynk), by programmable control of a Slider selected from the Widget Box,
using the Arduino IDE and a NodeMCU board.
THEORY :
A motor is an electrical machine which converts electrical energy into mechanical energy.
The principle of working of a DC motor is that "whenever a current carrying conductor is
placed in a magnetic field, it experiences a mechanical force". The direction of this force is
given by Fleming's left hand rule and its magnitude is given by F = BIL. Where, B = magnetic
flux density, I = current and L = length of the conductor within the magnetic field.
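For example, with the formula above, a conductor of length 0.1 m carrying 2 A in a 0.5 T field experiences F = BIL = 0.5 × 2 × 0.1 = 0.1 N. The sample values below are illustrative:

```python
def motor_force(B, I, L):
    """Force on a current-carrying conductor: F = B * I * L (newtons)."""
    return B * I * L

F = motor_force(B=0.5, I=2.0, L=0.1)  # 0.5 T, 2 A, 0.1 m
```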
Fig 1 : Working of DC Motor
Fleming's left hand rule: If we stretch the first finger, second finger and thumb of our left hand
to be perpendicular to each other, and the direction of the magnetic field is represented by the
first finger and the direction of the current by the second finger, then the thumb represents the
direction of the force experienced by the current-carrying conductor.
The figure above helps in understanding the
working principle of a DC motor. When the armature windings are connected to a DC supply,
a current is set up in the winding. The magnetic field may be provided by a field winding
(electromagnetism) or by permanent magnets. In this case, the current-carrying armature
conductors experience a force due to the magnetic field, according to the principle stated above.
The commutator is segmented to achieve unidirectional torque. Otherwise, the direction of
the force would reverse every time the direction of movement of the conductor in the
magnetic field reverses.
INPUT : DC motor (1 qty.)
One NodeMCU board
Wires for connection
OUTPUT : The actual fan speed is controlled by adjusting the slider in the BLYNK app.
Connection : Fig 2 shows the interfacing of the DC motor with the NodeMCU. The Blynk app
screen, which is used to control the fan (or any output connected to the NodeMCU) remotely,
is shown in Fig 3.
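A Blynk slider delivers an integer in a range you configure in the widget (say 0–100), which the sketch must scale to the ESP8266's default PWM duty-cycle range (0–1023 with analogWrite). The scaling is just Arduino's map() function; a Python rendering, where the 0–100 slider range is an assumed widget setting:

```python
def scale(value, in_min, in_max, out_min, out_max):
    """Integer re-map, equivalent to Arduino's map() function."""
    return (value - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

# Slider 0-100 (assumed widget setting) -> ESP8266 PWM duty 0-1023
duties = [scale(v, 0, 100, 0, 1023) for v in (0, 50, 100)]
```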
FAQ :
1] Differentiate NodeMCU vs. Arduino UNO.
2] What is the purpose of the Rx and Tx pins on the NodeMCU?
3] Is it possible to control a 230 V, 50 Hz fan? How?
4] How does a relay operate?
Assignment No.: 07
.Back to Index
Lab. Assignment No – 7
AIM:
Design a sophisticated system, which acquires data from multiple sensors and transmits it
through a wireless module via Arduino, and will be received by another Arduino and given to
the PC for analysis using Machine learning.
OBJECTIVES:
To Acquire temperature values from 4 LM35 sensors and 1 DHT11 temperature+humidity
sensor in real time and transmit that data to the remote Arduino wirelessly using NRF24l01+
transceiver modules and further give that data to PC via serial communication and analyze it
using Machine Learning in Python.
THEORY:
A] LM35 Sensor:
Applications:
Power Supplies
Battery Management
HVAC
Appliances
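The LM35 outputs an analog voltage of 10 mV per °C, so on a 5 V Arduino with a 10-bit ADC the temperature is recovered as reading × (5000 mV / 1024) / 10. A Python rendering of that conversion (the 5 V reference and 10-bit resolution are the usual Arduino UNO assumptions):

```python
def lm35_celsius(adc_reading, vref_mv=5000.0, adc_steps=1024):
    """Convert a raw ADC reading from an LM35 into degrees Celsius.

    The LM35 produces 10 mV per degree Celsius.
    """
    millivolts = adc_reading * vref_mv / adc_steps
    return millivolts / 10.0

t = lm35_celsius(51)  # 51 * 5000/1024 = ~249 mV, i.e. roughly 24.9 degC
```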
B] DHT11 Sensor:
DHT11 Temperature & Humidity Sensor features a temperature & humidity sensor complex
with a calibrated digital signal output. By using the exclusive digital-signal-acquisition
technique and temperature & humidity sensing technology, it ensures high reliability and
excellent long-term stability. This sensor includes a resistive-type humidity measurement
component and an NTC temperature measurement component, and connects to a high
performance 8-bit microcontroller, offering excellent quality, fast response, anti-interference
ability and cost-effectiveness.
Features of DHT11 are as follows:
Operating Voltage: 3.5V to 5.5V
Operating current: 0.3mA (measuring) 60uA (standby)
Output: Serial data
Temperature Range: 0°C to 50°C
Humidity Range: 20% to 90%
Resolution: Temperature and Humidity both are 16-bit
Accuracy: ±1°C and ±1%
Applications:
Measure temperature and humidity
Local Weather station
Automatic climate control
Environment monitoring
The nRF24L01+ is a single chip 2.4GHz transceiver with an embedded baseband protocol
engine, suitable for ultra-low power wireless applications. The nRF24L01+ is designed for
operations in the world-wide ISM frequency band at 2.4-2.4835GHz. To design a radio system
with the nRF24L01+, you simply need an MCU and a few external passive components. One
can operate and configure the NRF24L01+ through a Serial Peripheral Interface (SPI). The
register map, which is accessible through the SPI, contains all configuration registers in the
nRF24L01+ and is accessible in all operation modes of the chip.
Features of nRF24L01+ are as follows:
World Wide 2.4GHz ISM band Operation
1 to 32 bytes dynamic payload size per packet
Integrated voltage regulator
1.9V to 3.6V Supply Range
250kbps, 1 and 2 Mbps air data rate
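Since the nRF24L01+ carries at most 32 bytes per packet, the four LM35 temperatures plus the DHT11 temperature and humidity readings fit in one payload if they are packed as fixed-size fields. A sketch of such packing with Python's struct module (the field layout of six little-endian 32-bit floats is an assumption for illustration, not a requirement of the module):

```python
import struct

PAYLOAD_FMT = "<6f"  # six little-endian 32-bit floats: 4 LM35 temps + DHT11 temp + humidity

def pack_readings(lm35_temps, dht_temp, dht_humidity):
    payload = struct.pack(PAYLOAD_FMT, *lm35_temps, dht_temp, dht_humidity)
    assert len(payload) <= 32  # nRF24L01+ maximum payload size
    return payload

def unpack_readings(payload):
    values = struct.unpack(PAYLOAD_FMT, payload)
    return list(values[:4]), values[4], values[5]

pkt = pack_readings([24.5, 25.0, 24.8, 25.2], 25.1, 60.0)
temps, t, h = unpack_readings(pkt)
```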
For detail description please refer the datasheet -
https://www.sparkfun.com/datasheets/Components/SMD/nRF24L01Pluss_Preliminary_Product_Specification_v1_0.pdf
D] Machine Learning:
Machine learning is an application of artificial intelligence (AI) that provides systems the ability
to automatically learn and improve from experience without being explicitly programmed.
Machine learning focuses on the development of computer programs that can access data and
use it to learn for themselves.
The process of learning begins with observations or data, such as examples, direct experience,
or instruction, in order to look for patterns in data and make better decisions in the future
based on the examples that we provide. The primary aim is to allow computers to learn
automatically, without human intervention or assistance, and adjust actions accordingly.
Machine learning enables analysis of massive quantities of data. While it generally delivers
faster, more accurate results in order to identify profitable opportunities or dangerous risks, it
may also require additional time and resources to train it properly. Combining machine
learning with AI and cognitive technologies can make it even more effective in processing large
volumes of information.
Support Vector Machine:
A Support Vector Machine models the situation by creating a feature space, which is a finite-
dimensional vector space, each dimension of which represents a "feature" of a particular
object. In the context of spam or document classification, each "feature" is the prevalence or
importance of a particular word. The goal of the SVM is to train a model that assigns new
unseen objects into a particular category. It achieves this by creating a linear partition of the
feature space into two categories. Based on the features in the new unseen objects (e.g.
documents/emails), it places an object "above" or "below" the separation plane, leading to a
categorisation (e.g. spam or non-spam). This makes it an example of a non-probabilistic linear
classifier. It is non-probabilistic, because the features in the new objects fully determine its
location in feature space and there is no stochastic element involved.
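The "above or below the separating plane" decision described above is just the sign of a linear function of the features. A minimal sketch of an already-trained linear classifier's decision rule (the weights, bias, and labels here are made-up illustrative numbers, not a trained model; in the assignment itself, scikit-learn's SVC would perform both the training and the prediction):

```python
def svm_predict(weights, bias, features):
    """Linear decision rule: classify by which side of the hyperplane w.x + b = 0 the point lies."""
    score = sum(w * x for w, x in zip(weights, features)) + bias
    return "hot" if score > 0 else "comfortable"

# Hypothetical 2-feature model: [temperature degC, humidity %]
w, b = [0.8, 0.05], -25.0
labels = [svm_predict(w, b, f) for f in ([35.0, 80.0], [22.0, 40.0])]
```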
Procedure:
1. Transmitter – Connect the 4 LM35 sensors to the A0-A3 pins of the Arduino, the DHT11 to the
D7 pin, and the nRF24L01+ to the respective pins from D9 to D13 as given in the official library.
2. Receiver – Connect the nRF24L01+ exactly the same as on the transmitter.
3. Download the libraries for the DHT11 and nRF24L01+ via ‘Manage Libraries’ in the Arduino IDE.
4. Upload the respective code to the transmitter and receiver Arduino boards and check the
output on the serial monitor of the Arduino IDE.
5. The same output is given to the machine learning algorithm in Python, which has been
trained to classify the atmosphere in real time.
6. We use a simple Support Vector Classifier, which is a widely used algorithm in machine
learning. It can classify the atmosphere with high accuracy.
CONCLUSION :
In this way we can use open-source platforms like Arduino and Python to collect our own
data and use it for machine learning applications in real time without any limitations. The
sensors can be varied, and different machine learning training algorithms can be implemented
for higher accuracy and performance.
FAQ :
1. What is the drawback of the LM35 sensor?
2. What is the limitation of the DHT11 sensor?
3. Why use the nRF24L01+, when we have Bluetooth and XBee modules? What are the
advantages and disadvantages?
4. What is machine learning? Why do we use it to solve real-world problems? Compare with
the classical approach.
Assignment No.: 11
Back to Index
Lab. Assignment No – 11
AIM:
Design a system which connects hardware to an Android smartphone with unique identifier
security, i.e. an Authentication Token.
OBJECTIVES:
To design a system which connects an Arduino microcontroller to an Android smartphone with
unique authentication identifier security.
Authentication
Authentication is the process of proving that people and organizations are who or what they
claim to be. For wireless networks, this is often done at two layers: the network layer and the
application layer. The network requires the user to be authenticated before that person is
granted access. This can be done implicitly, based on the device or modem being used, or
explicitly, using a variety of mechanisms. At the application layer, authentication is important
at two levels: the client and the enterprise server. To gain access to enterprise data, the client
has to prove to the server that it is what it says it is. At the same time, before a client allows an
outside server to connect to it—for example, to push some content—the server has to
authenticate itself to the client application.
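A common way to realise the "unique identifier" of this assignment is a random authentication token generated on one side and checked on the other before any data is exchanged. A minimal sketch using Python's secrets module (the 16-byte token length is an illustrative choice; the pairing protocol around it is up to the design):

```python
import secrets
import hmac

def issue_token(n_bytes=16):
    """Generate a random authentication token for the paired device."""
    return secrets.token_hex(n_bytes)

def verify_token(expected, presented):
    """Constant-time comparison to avoid leaking token bytes through timing."""
    return hmac.compare_digest(expected, presented)

token = issue_token()
ok = verify_token(token, token)        # the paired device presents the right token
bad = verify_token(token, issue_token())  # a different random token is rejected
```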
Data Integrity
Data integrity is assurance that the data in question has not been altered or corrupted in any
way during the transmission from the sender to the receiver. This can be accomplished by
using data encryption in combination with a cryptographic checksum or Message
Authentication Code (MAC). This information is encoded into the message itself by applying an
algorithm to the message. When recipients receive the message, they compute the MAC and
compare it with the MAC encoded in the message to see if the codes are the same. If they are,
recipients can be confident that the message has not been tampered with. If the codes are
different, recipients can discard the data as inaccurate.
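The MAC check described above can be sketched with Python's standard hmac module: sender and receiver share a key, the sender appends HMAC(key, message) to the message, and the receiver recomputes the MAC and compares. HMAC-SHA256 is one common choice of MAC; the key below is a placeholder:

```python
import hmac
import hashlib

KEY = b"shared-secret-key"  # placeholder; a real system negotiates this securely

def mac(message: bytes) -> bytes:
    return hmac.new(KEY, message, hashlib.sha256).digest()

def send(message: bytes) -> bytes:
    return message + mac(message)          # message with 32-byte MAC appended

def receive(packet: bytes):
    message, tag = packet[:-32], packet[-32:]
    if not hmac.compare_digest(tag, mac(message)):
        return None                        # tampered: discard as inaccurate
    return message

good = receive(send(b"temperature=24.9"))
pkt = send(b"temperature=24.9")
tampered = receive(b"X" + pkt[1:])         # flip the first message byte in transit
```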
Confidentiality
Confidentiality is one of the most important aspects of security, and certainly the most talked
about. Confidentiality is about maintaining data privacy, making sure it cannot be viewed by
unwanted parties. Most often, when people are worried about the security of a system, they
are concerned that sensitive information, such as a credit card number or health records, can
be viewed by parties with malicious intent. The most common way of preventing this intrusion
is by encrypting the data. This process involves encoding the content of a message into a form
that is unreadable by anyone other than the intended recipient.
Authorization
Authorization is the process of determining the user’s level of access—whether a user has the
right to perform certain actions. Authorization is often closely tied to authentication. Once a
user is authenticated, the system can determine what that party is permitted to do. Access
control lists (ACLs) are often used to help determine this. For example, all users may have
read- only access to a set of data, while the administrator, or another trusted source, may also
have write access to the data.
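The ACL idea above can be sketched as a simple lookup table; the users, resources, and actions here are illustrative:

```python
# Each resource maps users to the set of actions they are permitted to perform
acl = {
    "report.csv": {
        "alice": {"read"},
        "admin": {"read", "write"},
    },
}

def authorized(user, action, resource):
    """Check whether an already-authenticated user may perform an action."""
    return action in acl.get(resource, {}).get(user, set())

print(authorized("alice", "read",  "report.csv"))  # True
print(authorized("alice", "write", "report.csv"))  # False: read-only user
print(authorized("admin", "write", "report.csv"))  # True: trusted source
```

The default-deny behavior (unknown users and unknown resources get an empty permission set) is the usual safe design choice for access control.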
Nonrepudiation
Nonrepudiation is about making parties accountable for transactions in which they have
participated. It involves identifying the parties in such a way that they cannot at a later time
deny their involvement in the transaction. In essence, it means that both the sender and the
recipient of a message can prove to a third party that the sender did indeed send the message
and the recipient received the identical message. To accomplish this, each transaction has to
be signed with a digital signature that can be verified and time-stamped by a trusted third
party.
THEORY:
1. Blynk
Blynk was designed for the Internet of Things. It can control hardware remotely, display
sensor data, store data, visualize it, and do many other cool things.
There are three major components in the platform:
a) Blynk App - allows you to create amazing interfaces for your projects using the various widgets
we provide.
b) Blynk Server - responsible for all the communications between the smartphone and hardware.
You can use our Blynk Cloud or run your private Blynk server locally. It is open-source, can
easily handle thousands of devices, and can even be launched on a Raspberry Pi.
c) Blynk Libraries - for all the popular hardware platforms - enable communication with the
server and process all the incoming and outgoing commands.
Now imagine: every time you press a Button in the Blynk app, the message travels to the
Blynk Cloud, where it magically finds its way to your hardware. It works the same in the
opposite direction, and everything happens in the blink of an eye, as shown in Fig. 1.
Fig 1: Blynk server authentication security
Features:
Similar API & UI for all supported hardware & devices
Connection to the cloud using:
- Wi-Fi
- Bluetooth and BLE
- Ethernet
- USB (Serial)
- GSM
Set of easy-to-use Widgets
Direct pin manipulation with no code writing
Easy to integrate and add new functionality using virtual pins
History data monitoring via Super Chart widget
Device-to-Device communication using Bridge Widget
Sending emails, tweets, push notifications, etc.
The light sensor is an environment sensor that measures the ambient light level
(illuminance) in lux. In phones it is used to control screen brightness.
In order to accept its readings, define a handler for the virtual pin:
BLYNK_WRITE(V1)
{
int lx = param.asInt(); // ambient light level in lux
}
The light sensor does not work in the background.
3. Experimental setup:
CONCLUSION: In this way, we can use open-source platforms such as NodeMCU and its
IDE to build a smart lighting system, based on an Android smartphone sensor, that controls an
LED's ON/OFF operation automatically by turning it ON in the dark and OFF in bright light.
FAQ:
5. What is Blynk server?
6. What are the features of the Blynk server, and how does it provide security?
7. What is Authentication token?
8. Is it possible to control onboard digital and analog pins directly without using virtual pins? If
‘yes’, how? If ‘no’, why?
Assignment No. : 12
Back to Index
Lab. Assignment No – 12
Theory:
INTRODUCTION
Wireless communication is the transfer of information over a distance without the use of
electrical conductors or "wires". The distances involved may be short (a few meters, as in
television remote control) or long (thousands or millions of kilometers, as in radio
communications). When the context is clear, the term is often shortened to "wireless". It
encompasses various types of fixed, mobile, and portable two-way radios, cellular telephones,
Personal Digital Assistants (PDAs), and wireless networking.
In 1895, Guglielmo Marconi opened the way for modern wireless communications by
transmitting the three-dot Morse code for the letter ‘S’ over a distance of three kilometers
using electromagnetic waves. From this beginning, wireless communications has developed
into a key element of modern society. Wireless communications have some special
characteristics that have motivated specialized studies. First, wireless communication relies
on a scarce resource – namely, the radio spectrum. In order to foster the development of
wireless communications (including telephony and broadcasting), those spectrum assets were privatized.
Second, use of spectrum for wireless communications required the development of key
complementary technologies; especially those that allowed higher frequencies to be utilized
more efficiently. Finally, because of its special nature, the efficient use of spectrum required
the coordinated development of standards.
The term is used to describe modern wireless connections such as those in cellular networks
and wireless broadband internet, mainly using radio waves. The mobile wireless industry has
been creating, revolutionizing, and evolving its technology since the early 1970s. In the past few
decades, mobile wireless technologies have been classified according to their generation,
which largely specifies the type of services and the data transfer speeds of each class of
technologies.
i. ZERO GENERATION TECHNOLOGY (0G – 0.5G)
0G refers to pre-cellular mobile telephony technology in 1970s. These mobile telephones were
usually mounted in cars or trucks, though briefcase models were also made. Mobile radio
telephone systems preceded modern cellular mobile telephony technology. Since they were
the predecessors of the first generation of cellular telephones, these systems are sometimes
referred to as 0G (zero generation) systems. Technologies used in 0G systems included PTT
(Push to Talk), MTS (Mobile Telephone System), IMTS (Improved Mobile Telephone Service),
AMTS (Advanced Mobile Telephone System), OLT (Norwegian for Offentlig Landmobil Telefoni,
Public Land Mobile Telephony) and MTD. 0.5G is a group of technologies with improved
features over the basic 0G technologies. These early mobile telephone systems can be distinguished from
earlier closed radiotelephone systems in that they were available as a commercial service that
was part of the public switched telephone network, with their own telephone numbers, rather
than part of a closed network such as a police radio or taxi dispatch system. Typically, the
transceiver (transmitter-receiver) was mounted in the vehicle trunk and attached to the "head"
(dial, display, and handset) mounted near the driver's seat. They were sold through various
outlets, including two-way radio dealers.
The primary users were loggers, construction foremen, realtors, and celebrities, for basic voice
communication.
Early examples for this technology are:
1. The Autoradiopuhelin (ARP) launched in 1971 in Finland as the country's first public
commercial mobile phone network.
2. The B-Netz, launched in 1972 in Germany as the country's second public commercial mobile
phone network (but the first that no longer required human operators to connect calls).
ii. FIRST GENERATION TECHNOLOGY (1G)
In the 1980s the mobile cellular era started, and since then mobile communications have
undergone significant changes and experienced enormous growth. First-generation mobile
systems used analog transmission for speech services. In 1979, the first cellular system in the
world became operational by Nippon Telephone and Telegraph (NTT) in Tokyo, Japan. Two
years later, the cellular epoch reached Europe. The two most popular analogue systems were
Nordic Mobile Telephones (NMT) and Total Access Communication Systems (TACS). Other than
NMT and TACS, some other analog systems were also introduced in the 1980s across Europe.
All of these systems offered handover and roaming capabilities but the cellular networks were
unable to interoperate between countries. This was one of the inevitable disadvantages of
first-generation mobile networks.
In the United States, the Advanced Mobile Phone System (AMPS) was launched in 1982. The
system was allocated a 40-MHz bandwidth within the 800 to 900 MHz frequency range by the
Federal Communications Commission (FCC) for AMPS. In 1988, an additional 10 MHz
bandwidth, called Expanded Spectrum (ES), was allocated to AMPS. It was first deployed in
Chicago, with a service area of 2,100 square miles. AMPS offered 832 channels, with a data
rate of 10 kbps. Although omnidirectional antennas were used in the earlier AMPS
implementation, it was realized that using directional antennas would yield better cell reuse. In
fact, the smallest reuse factor that would fulfill the 18 dB signal-to-interference ratio (SIR) using
120-degree directional antennas was found to be 7.
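That reuse factor can be checked against the standard first-order co-channel SIR approximation, S/I = (sqrt(3N))^n / i0, where N is the cluster size, n the path-loss exponent (typically taken as 4 in urban settings), and i0 the number of first-tier co-channel interferers. A small sketch:

```python
import math

def sir_db(N, interferers=6, path_loss_exp=4):
    """First-order co-channel SIR estimate: S/I = (sqrt(3N))^n / i0, in dB."""
    sir = (math.sqrt(3 * N) ** path_loss_exp) / interferers
    return 10 * math.log10(sir)

# N = 7 with omnidirectional antennas (6 first-tier co-channel interferers)
print(round(sir_db(7), 2))                 # 18.66 dB
# N = 7 with 120-degree sectoring (about 2 first-tier interferers remain)
print(round(sir_db(7, interferers=2), 2))  # 23.43 dB
```

With six omnidirectional interferers the estimate sits right at the 18 dB target; 120-degree sectoring, which leaves roughly two first-tier interferers per sector, adds a comfortable margin, which is why the 7-cell reuse pattern was adopted.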
Hence, a 7-cell reuse pattern was adopted for AMPS. Transmissions from the base stations to
mobiles occur over the forward channel using frequencies between 869-894 MHz. The reverse
channel is used for transmissions from mobiles to base station, using frequencies between
824- 849 MHz. AMPS and TACS use the frequency modulation (FM) technique for radio
transmission. Traffic is multiplexed onto an FDMA (frequency division multiple access) system.
iii. SECOND GENERATION TECHNOLOGY (2G - 2.75G)
By the late 1980s, it was clear that the first generation cellular systems—based on analog
signaling techniques—were becoming obsolete. Advances in integrated circuit (IC) technology
had made digital communications not only practical, but actually more economical than
analog technology. Digital communication enables advanced source coding techniques to be
utilized. This allows the spectrum to be used much more efficiently and, thereby, reduces the
amount of bandwidth required for voice and video. In addition, we can use error correction
coding to provide a degree of resistance to interference and fading that plagues analog
systems, and to allow a lower transmit power. Also, with digital systems, control information is
more efficiently handled, which facilitates network control. Second generation digital systems
can be classified by their multiple access techniques as either Frequency Division Multiple
Access (FDMA), Time Division Multiple Access (TDMA) or Code Division Multiple Access
(CDMA).
In FDMA, the radio spectrum is divided into a set of frequency slots and each user is assigned a
separate frequency to transmit. In TDMA, several users transmit at the same frequency but in
different time slots. CDMA uses the principle of direct sequence spread-spectrum: the signals
are modulated with high bandwidth spreading waveforms called signature waveforms or
codes. Although the users transmit at both the same frequency and time, separation of signals
is achieved because the signature waveforms have very low cross correlation.
In practice, the TDMA and CDMA schemes are combined with FDMA. Thus the term “TDMA” is
used to describe systems that first divide the channel into frequency slots and then divide each
frequency slot into multiple time slots. Similarly, CDMA is actually a hybrid of CDMA and FDMA
where the channel is first divided into frequency slots. Each slot is shared by multiple users
who each use a different code.
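The low-cross-correlation separation described above can be demonstrated with length-4 Walsh codes, a deliberately tiny, illustrative example:

```python
# Two orthogonal spreading codes (length-4 Walsh codes, in +/-1 form)
code_a = [1,  1, 1,  1]
code_b = [1, -1, 1, -1]

def spread(bits, code):
    """Spread each data bit (+/-1) across one full code period."""
    return [b * c for b in bits for c in code]

def despread(signal, code):
    """Correlate the received signal with a code, one period per bit."""
    n = len(code)
    bits = []
    for i in range(0, len(signal), n):
        corr = sum(s * c for s, c in zip(signal[i:i + n], code))
        bits.append(1 if corr > 0 else -1)
    return bits

# Both users transmit at the same time and on the same frequency
user_a = spread([1, -1], code_a)
user_b = spread([-1, -1], code_b)
received = [a + b for a, b in zip(user_a, user_b)]

print(despread(received, code_a))  # [1, -1]:  user A's bits recovered
print(despread(received, code_b))  # [-1, -1]: user B's bits recovered
```

Because the two codes are orthogonal, each correlator nulls out the other user's contribution even though the two signals overlap completely in time and frequency.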
2.5G – GPRS (General Packet Radio Service)
2.5G, which stands for "second and a half generation," is a cellular wireless technology
developed between its predecessor, 2G, and its successor, 3G. The term "second and a half
generation" is used to describe 2G systems that have implemented a packet-switched domain
in addition to the circuit-switched domain. "2.5G" is an informal term, invented solely for
marketing purposes, unlike "2G" or "3G", which are officially defined standards based on those
defined by the International Telecommunication Union (ITU). GPRS could provide data rates from 56
kbit/s up to 115 kbit/s. It can be used for services such as Wireless Application Protocol (WAP)
access, Multimedia Messaging Service (MMS), and for Internet communication services such as
email and World Wide Web access. GPRS data transfer is typically charged per megabyte of
traffic transferred, while data communication via traditional circuit switching is billed per
minute of connection time, independent of whether the user actually is utilizing the capacity
or is in an idle state.
2.5G networks may support services such as WAP, MMS, SMS, mobile games, and search and
directory services.
2.75G – EDGE (Enhanced Data rates for GSM Evolution)
EDGE (EGPRS), an abbreviation for Enhanced Data rates for GSM Evolution, is a digital mobile
phone technology which acts as a bolt-on enhancement to 2G and 2.5G General Packet Radio
Service (GPRS) networks. This technology works in GSM networks. EDGE is a superset of GPRS
and can function on any network on which GPRS is deployed.
iv. THIRD GENERATION TECHNOLOGY (3G)
3G refers to the third generation of mobile telephony (that is, cellular) technology. The third
generation, as the name suggests, follows two earlier generations. The first generation (1G)
began in the early 80's with commercial deployment of Advanced Mobile Phone Service
(AMPS) cellular networks. Early AMPS networks used Frequency Division Multiplexing Access
(FDMA) to carry analog voice over channels in the 800 MHz frequency band.
3G technologies enable network operators to offer users a wider range of more advanced
services while achieving greater network capacity through improved spectral efficiency.
Services include wide area wireless voice telephony, video calls, and broadband wireless data,
all in a mobile environment. Additional features also include HSPA data transmission
capabilities able to deliver speeds up to 14.4 Mbit/s on the downlink and 5.8 Mbit/s on the
uplink. Spectral efficiency or spectrum efficiency refers to the amount of information that can
be transmitted over a given bandwidth in a specific digital communication system. High-Speed
Packet Access (HSPA) is a collection of mobile telephony protocols that extend and improve
the performance of existing UMTS protocols.
3G technologies make use of TDMA and CDMA, and offer value-added services such as mobile
television, GPS (Global Positioning System) and video conferencing. The basic feature of 3G
technology is fast data transfer rates. 3G technology is much more flexible, because it is able
to support the five major radio technologies, which operate under CDMA, TDMA and FDMA.
3.5G – HSDPA (High-Speed Downlink Packet Access)
High-Speed Downlink Packet Access (HSDPA) is a mobile telephony protocol, also called 3.5G
(or "3½G"), which provides a smooth evolutionary path for UMTS-based 3G networks, allowing
for higher data transfer speeds. HSDPA is a packet-based data service in the W-CDMA downlink
with data transmission up to 8-10 Mbit/s (and 20 Mbit/s for MIMO systems) over a 5 MHz
bandwidth. HSDPA implementations include Adaptive Modulation and Coding (AMC),
Multiple-Input Multiple-Output (MIMO), Hybrid Automatic Repeat reQuest (HARQ),
fast cell search, and advanced receiver design.
3.75G – HSUPA (High-Speed Uplink Packet Access)
3.75G refers to the technologies beyond the well-defined 3G wireless/mobile technologies.
High Speed Uplink Packet Access (HSUPA) is a UMTS/WCDMA uplink evolution technology.
The HSUPA mobile telecommunications technology is directly related to HSDPA, and the two
are complementary to one another. HSUPA enhances advanced person-to-person data
applications with higher and more symmetric data rates, such as mobile e-mail and real-time
person-to-person gaming. Traditional applications along with many consumer applications
benefit from the enhanced uplink speed. HSUPA initially boosted the UMTS/WCDMA uplink up
to 1.4 Mbps and in later releases up to 5.8 Mbps.
v. FOURTH GENERATION (4G)
4G refers to the fourth generation of cellular wireless standards. It is a successor to 3G and 2G
families of standards. The nomenclature of the generations generally refers to a change in the
fundamental nature of the service, non-backwards compatible transmission technology and
new frequency bands. The first was the move from 1981 analogue (1G) to digital (2G)
transmission in 1992. This was followed, in 2002, by 3G multimedia support, spread-spectrum
transmission and at least 200 kbit/s, soon expected to be followed by 4G, which refers to all-IP
packet-switched networks, mobile ultra-broadband (gigabit speed) access and multi-carrier
transmission. Pre-4G technologies such as mobile WiMAX and first-release 3G Long Term
Evolution (LTE) have been available on the market since 2006 and 2009, respectively.
4G is basically an extension of 3G technology, with more bandwidth and more services than
3G offers. The expectation for 4G technology is high-quality audio/video streaming over an
end-to-end Internet Protocol. If the Internet Protocol (IP) Multimedia Subsystem movement
achieves its goals, the underlying WiMAX or mobile architecture will become progressively
more transparent, and the adoption of several architectures by a single network operator
will become ever more common.
Some companies tested 4G communication at 100 Mbps for mobile users and up to 1
Gbps over fixed stations, and planned to publicly launch their first commercial wireless
networks around 2010. Other mobile communication companies worked on 4G technology
even more quickly: Sprint Nextel planned to launch WiMAX-based 4G broadband mobile
networks in the United States, and other developed countries, such as the United Kingdom,
announced plans to auction 4G wireless frequencies.
The word "MAGIC" also refers to 4G wireless technology and stands for Mobile
multimedia, Any-where, Global mobility solutions over integrated wireless, and Customized
services.
vi. FIFTH GENERATION TECHNOLOGY (5G)
5G (5th generation mobile networks or 5th generation wireless systems) is a name used in
some research papers and projects to denote the next major phase of mobile
telecommunications standards beyond the upcoming 4G standards, which are expected to be
finalized between approximately 2011 and 2013. Currently, 5G is not a term officially used for
any particular specification or in any official document yet made public by telecommunication
companies or standardization bodies such as 3GPP, WiMAX Forum or ITU-R. New 3GPP
standard releases beyond 4G and LTE Advanced are in progress, but are not considered new
mobile generations. 5G stands for 5th Generation Mobile technology. 5G technology will
change the way cell phones are used, offering very high bandwidth that users have never
experienced before. Nowadays mobile users have much awareness of cell phone (mobile)
technology. 5G technology includes all types of advanced features, which will make 5G
technology powerful and in huge demand in the near future.
The gigantic array of innovative technology being built into new cell phones is stunning. A 5G
handheld phone will offer more power and features than at least 1,000 lunar modules. A user
will also be able to hook a 5G cell phone to a laptop to get broadband internet access. 5G
technology includes a camera, MP3 recording, a video player, large phone memory, fast
dialing, an audio player and much more. For children, fun Bluetooth technology and piconets
have also come onto the market.
5G technology is going to be a new revolution in the mobile market. Through 5G technology,
you can use your cellular phone worldwide; the technology will also reach the Chinese mobile
market, and a user will be able to access a German phone as a local phone. With the arrival of
cell phones similar to PDAs, your whole office is at your fingertips, or in your phone. 5G
technology has extraordinary data capabilities and the ability to tie together unrestricted call
volumes and infinite data broadcast within the latest mobile operating systems. 5G technology
has a bright future because it can handle the best technologies and offer priceless handsets to
customers. In the coming days, 5G technology may take over the world market.
5G technologies have an extraordinary capability to support software and consultancy. The
router and switch technology used in 5G networks provides high connectivity. 5G technology
distributes internet access to nodes within a building and can be deployed with a union of
wired and wireless network connections. The current trend of 5G technology has a glowing
future.
GSM (Global System for Mobile Communication)
GSM, or Global System for Mobile Communication, is a digital cellular system. It originated in
Finland, in Europe; however, it is now used throughout the world. GSM accounts for 80% of the
mobile phone technology market, with more than 3 billion users. GSM technology gained its
popularity when people used it to talk to their friends and relatives. The use of GSM is possible
due to the SIM (Subscriber Identity Module). GSM is easy to use, affordable, and lets you carry
your cell phone everywhere. GSM is a 2G technology. GSM operates in several frequency
bands, of which the 900 MHz and 1800 MHz bands are the most widely used.
GSM (Global System for Mobile Communication) offers moderate security. It allows for
encryption between the end user and the service base station. The use of various forms of
cryptographic modules is part of GSM technology.
EDGE Technology (Enhanced Data Rates for GSM Evolution Technology)
EDGE technology is an extended version of GSM. It allows the clear and fast transmission of
data and information. EDGE is also termed IMT-SC, or single carrier. EDGE technology was
introduced by Cingular, which is now known as AT&T. EDGE is a radio technology and is part of
the third generation of technologies. EDGE technology is preferred over GSM due to its
flexibility to carry both packet-switched and circuit-switched data. EDGE is termed a
backward-compatible technology, meaning it continues to work with older-generation
devices. EDGE technology is supported by the Third Generation Partnership Project (3GPP);
this association helps and supports the upgrading of GSM, EDGE, and
other related technologies. The frequency, capability and performance of EDGE technology is
more than the 2G GSM Technology. EDGE technology holds more sophisticated coding and
transmission of data. EDGE technology can help you connect to the internet.
This technology supports the packet-switching system. EDGE provides a broadband internet
connection for its users and helps them exploit multimedia services. EDGE technology does
not involve the expense of additional hardware and software; it only requires the base station
to install an EDGE transceiver. EDGE is an improved technology supported by almost all
network vendors; all they have to do is upgrade their stations. EDGE has its edge because it
can make use of both circuit-switched and packet-switched technology. EDGE is also known as
EGPRS, or Enhanced General Packet Radio Service. It is important to have a GPRS network if
one wants to use EDGE, because EDGE cannot work without GSM technology. Therefore it is
an extended version of GSM technology.
Conclusion:
FAQs:-