Natural Language Processing with TensorFlow: Teach Language to Machines Using Python's Deep Learning Library, 1st Edition, by Thushan Ganegedara
Table of Contents
Natural Language Processing with TensorFlow
Why subscribe?
PacktPub.com
Contributors
About the author
About the reviewers
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Conventions used
Get in touch
Reviews
1. Introduction to Natural Language Processing
What is Natural Language Processing?
Tasks of Natural Language Processing
The traditional approach to Natural Language Processing
Understanding the traditional approach
Example – generating football game summaries
Drawbacks of the traditional approach
The deep learning approach to Natural Language Processing
History of deep learning
The current state of deep learning and NLP
Understanding a simple deep model – a Fully-Connected Neural Network
The roadmap – beyond this chapter
Introduction to the technical tools
Description of the tools
Installing Python and scikit-learn
Installing Jupyter Notebook
Installing TensorFlow
Summary
2. Understanding TensorFlow
What is TensorFlow?
Getting started with TensorFlow
TensorFlow client in detail
TensorFlow architecture – what happens when you execute the client?
Cafe Le TensorFlow – understanding TensorFlow with an analogy
Inputs, variables, outputs, and operations
Defining inputs in TensorFlow
Feeding data with Python code
Preloading and storing data as tensors
Building an input pipeline
Defining variables in TensorFlow
Defining TensorFlow outputs
Defining TensorFlow operations
Comparison operations
Mathematical operations
Scatter and gather operations
Neural network-related operations
Nonlinear activations used by neural networks
The convolution operation
The pooling operation
Defining loss
Optimization of neural networks
The control flow operations
Reusing variables with scoping
Implementing our first neural network
Preparing the data
Defining the TensorFlow graph
Running the neural network
Summary
3. Word2vec – Learning Word Embeddings
What is a word representation or meaning?
Classical approaches to learning word representation
WordNet – using an external lexical knowledge base for learning word representations
Tour of WordNet
Problems with WordNet
One-hot encoded representation
The TF-IDF method
Co-occurrence matrix
Word2vec – a neural network-based approach to learning word representation
Exercise: is queen = king – he + she?
Designing a loss function for learning word embeddings
The skip-gram algorithm
From raw text to structured data
Learning the word embeddings with a neural network
Formulating a practical loss function
Efficiently approximating the loss function
Negative sampling of the softmax layer
Hierarchical softmax
Learning the hierarchy
Optimizing the learning model
Implementing skip-gram with TensorFlow
The Continuous Bag-of-Words algorithm
Implementing CBOW in TensorFlow
Summary
4. Advanced Word2vec
The original skip-gram algorithm
Implementing the original skip-gram algorithm
Comparing the original skip-gram with the improved skip-gram
Comparing skip-gram with CBOW
Performance comparison
Which is the winner, skip-gram or CBOW?
Extensions to the word embeddings algorithms
Using the unigram distribution for negative sampling
Implementing unigram-based negative sampling
Subsampling – probabilistically ignoring the common words
Implementing subsampling
Comparing the CBOW and its extensions
More recent algorithms extending skip-gram and CBOW
A limitation of the skip-gram algorithm
The structured skip-gram algorithm
The loss function
The continuous window model
GloVe – Global Vectors representation
Understanding GloVe
Implementing GloVe
Document classification with Word2vec
Dataset
Classifying documents with word embeddings
Implementation – learning word embeddings
Implementation – word embeddings to document embeddings
Document clustering and t-SNE visualization of embedded documents
Inspecting several outliers
Implementation – clustering/classification of documents with K-means
Summary
5. Sentence Classification with Convolutional Neural Networks
Introducing Convolutional Neural Networks
CNN fundamentals
The power of Convolutional Neural Networks
Understanding Convolutional Neural Networks
Convolution operation
Standard convolution operation
Convolving with stride
Convolving with padding
Transposed convolution
Pooling operation
Max pooling
Max pooling with stride
Average pooling
Fully connected layers
Putting everything together
Exercise – image classification on MNIST with CNN
About the data
Implementing the CNN
Analyzing the predictions produced with a CNN
Using CNNs for sentence classification
CNN structure
Data transformation
The convolution operation
Pooling over time
Implementation – sentence classification with CNNs
Summary
6. Recurrent Neural Networks
Understanding Recurrent Neural Networks
The problem with feed-forward neural networks
Modeling with Recurrent Neural Networks
Technical description of a Recurrent Neural Network
Backpropagation Through Time
How backpropagation works
Why we cannot use BP directly for RNNs
Backpropagation Through Time – training RNNs
Truncated BPTT – training RNNs efficiently
Limitations of BPTT – vanishing and exploding gradients
Applications of RNNs
One-to-one RNNs
One-to-many RNNs
Many-to-one RNNs
Many-to-many RNNs
Generating text with RNNs
Defining hyperparameters
Unrolling the inputs over time for Truncated BPTT
Defining the validation dataset
Defining weights and biases
Defining state persisting variables
Calculating the hidden states and outputs with unrolled inputs
Calculating the loss
Resetting state at the beginning of a new segment of text
Calculating validation output
Calculating gradients and optimizing
Outputting a freshly generated chunk of text
Evaluating text results output from the RNN
Perplexity – measuring the quality of the text result
Recurrent Neural Networks with Context Features – RNNs with longer memory
Technical description of the RNN-CF
Implementing the RNN-CF
Defining the RNN-CF hyperparameters
Defining input and output placeholders
Defining weights of the RNN-CF
Variables and operations for maintaining hidden and context states
Calculating output
Calculating the loss
Calculating validation output
Computing test output
Computing the gradients and optimizing
Text generated with the RNN-CF
Summary
7. Long Short-Term Memory Networks
Understanding Long Short-Term Memory Networks
What is an LSTM?
LSTMs in more detail
How LSTMs differ from standard RNNs
How LSTMs solve the vanishing gradient problem
Improving LSTMs
Greedy sampling
Beam search
Using word vectors
Bidirectional LSTMs (BiLSTM)
Other variants of LSTMs
Peephole connections
Gated Recurrent Units
Summary
8. Applications of LSTM – Generating Text
Our data
About the dataset
Preprocessing data
Implementing an LSTM
Defining hyperparameters
Defining parameters
Defining an LSTM cell and its operations
Defining inputs and labels
Defining sequential calculations required to process sequential data
Defining the optimizer
Decaying learning rate over time
Making predictions
Calculating perplexity (loss)
Resetting states
Greedy sampling to break unimodality
Generating new text
Example generated text
Comparing LSTMs to LSTMs with peephole connections and GRUs
Standard LSTM
Review
Example generated text
Gated Recurrent Units (GRUs)
Review
The code
Example generated text
LSTMs with peepholes
Review
The code
Example generated text
Training and validation perplexities over time
Improving LSTMs – beam search
Implementing beam search
Examples generated with beam search
Improving LSTMs – generating text with words instead of n-grams
The curse of dimensionality
Word2vec to the rescue
Generating text with Word2vec
Examples generated with LSTM-Word2vec and beam search
Perplexity over time
Using the TensorFlow RNN API
Summary
9. Applications of LSTM – Image Caption Generation
Getting to know the data
ILSVRC ImageNet dataset
The MS-COCO dataset
The machine learning pipeline for image caption generation
Extracting image features with CNNs
Implementation – loading weights and inferencing with VGG-16
Building and updating variables
Preprocessing inputs
Inferring VGG-16
Extracting vectorized representations of images
Predicting class probabilities with VGG-16
Learning word embeddings
Preparing captions for feeding into LSTMs
Generating data for LSTMs
Defining the LSTM
Evaluating the results quantitatively
BLEU
ROUGE
METEOR
CIDEr
BLEU-4 over time for our model
Captions generated for test images
Using TensorFlow RNN API with pretrained GloVe word vectors
Loading GloVe word vectors
Cleaning data
Using pretrained embeddings with TensorFlow RNN API
Defining the pretrained embedding layer and the adaptation layer
Defining the LSTM cell and softmax layer
Defining inputs and outputs
Processing images and text differently
Defining the LSTM output calculation
Defining the logits and predictions
Defining the sequence loss
Defining the optimizer
Summary
10. Sequence-to-Sequence Learning – Neural Machine Translation
Machine translation
A brief historical tour of machine translation
Rule-based translation
Statistical Machine Translation (SMT)
Neural Machine Translation (NMT)
Understanding Neural Machine Translation
Intuition behind NMT
NMT architecture
The embedding layer
The encoder
The context vector
The decoder
Preparing data for the NMT system
At training time
Reversing the source sentence
At testing time
Training the NMT
Inference with NMT
The BLEU score – evaluating the machine translation systems
Modified precision
Brevity penalty
The final BLEU score
Implementing an NMT from scratch – a German to English translator
Introduction to data
Preprocessing data
Learning word embeddings
Defining the encoder and the decoder
Defining the end-to-end output calculation
Some translation results
Training an NMT jointly with word embeddings
Maximizing matchings between the dataset vocabulary and the pretrained embeddings
Defining the embeddings layer as a TensorFlow variable
Improving NMTs
Teacher forcing
Deep LSTMs
Attention
Breaking the context vector bottleneck
The attention mechanism in detail
Implementing the attention mechanism
Defining weights
Computing attention
Some translation results – NMT with attention
Visualizing attention for source and target sentences
Other applications of Seq2Seq models – chatbots
Training a chatbot
Evaluating chatbots – Turing test
Summary
11. Current Trends and the Future of Natural Language Processing
Current trends in NLP
Word embeddings
Region embedding
Input representation
Learning region embeddings
Implementation – region embeddings
Classification accuracy
Probabilistic word embedding
Ensemble embedding
Topic embedding
Neural Machine Translation (NMT)
Improving the attention mechanism
Hybrid MT models
Penetration into other research fields
Combining NLP with computer vision
Visual Question Answering (VQA)
Caption generation for images with attention
Reinforcement learning
Teaching agents to communicate using their own language
Dialogue agents with reinforcement learning
Generative Adversarial Networks for NLP
Towards Artificial General Intelligence
One Model to Learn Them All
A joint many-task model – growing a neural network for multiple NLP tasks
First level – word-based tasks
Second level – syntactic tasks
Third level – semantic-level tasks
NLP for social media
Detecting rumors in social media
Detecting emotions in social media
Analyzing political framing in tweets
New tasks emerging
Detecting sarcasm
Language grounding
Skimming text with LSTMs
Newer machine learning models
Phased LSTM
Dilated Recurrent Neural Networks (DRNNs)
Summary
References
A. Mathematical Foundations and Advanced TensorFlow
Basic data structures
Scalar
Vectors
Matrices
Indexing of a matrix
Special types of matrices
Identity matrix
Diagonal matrix
Tensors
Tensor/matrix operations
Transpose
Multiplication
Element-wise multiplication
Inverse
Finding the matrix inverse – Singular Value Decomposition (SVD)
Norms
Determinant
Probability
Random variables
Discrete random variables
Continuous random variables
The probability mass/density function
Conditional probability
Joint probability
Marginal probability
Bayes' rule
Introduction to Keras
Introduction to the TensorFlow seq2seq library
Defining embeddings for the encoder and decoder
Defining the encoder
Defining the decoder
Visualizing word embeddings with TensorBoard
Starting TensorBoard
Saving word embeddings and visualizing via TensorBoard
Summary
Index
Natural Language Processing with TensorFlow
Copyright © 2018 Packt Publishing
Every effort has been made in the preparation of this book to ensure
the accuracy of the information presented. However, the information
contained in this book is sold without warranty, either express or
implied. Neither the author, nor Packt Publishing or its dealers and
distributors, will be held liable for any damages caused or alleged to
have been caused directly or indirectly by this book.
Livery Place
35 Livery Street
ISBN 978-1-78847-831-1
www.packtpub.com
mapt.io
Mapt is an online digital library that gives you full access to over
5,000 books and videos, as well as industry leading tools to help you
plan your personal development and advance your career. For more
information, please visit our website.
Why subscribe?
Spend less time learning and more time coding with practical
eBooks and Videos from over 4,000 industry professionals
Learn better with Skill Plans built especially for you
Get a free eBook or video every month
Mapt is fully searchable
Copy and paste, print, and bookmark content
PacktPub.com
Did you know that Packt offers eBook versions of every book
published, with PDF and ePub files available? You can upgrade to the
eBook version at www.PacktPub.com and, as a print book customer,
you are entitled to a discount on the eBook copy. Get in touch with
us at service@packtpub.com for more details.
So, what makes such NLP tasks so versatile and accurate for our
everyday tasks? The underpinning elements are "deep learning"
algorithms. Deep learning algorithms are essentially complex neural
networks that can map raw data to a desired output without
requiring any sort of task-specific feature engineering. This means
that you can provide a customer's hotel review and the algorithm
can directly answer the question "How positive is the customer
about this hotel?". Deep learning has also reached, and even
exceeded, human-level performance in a variety of NLP tasks (for
example, speech recognition and machine translation).
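The idea of mapping raw text straight to an answer can be illustrated with a toy, self-contained sketch. This is not code from the book: it uses a tiny perceptron-style linear classifier over a bag of words, and the reviews and labels below are made up for illustration.

```python
# Toy sketch: map raw hotel reviews to a sentiment label, learned
# end-to-end from labeled examples rather than hand-written rules.
# Illustrative stand-in for the deep networks used in the book.

def tokenize(text):
    return text.lower().split()

def train(reviews, labels, epochs=20, lr=0.1):
    """Train a single-layer (perceptron-style) classifier."""
    vocab = sorted({w for r in reviews for w in tokenize(r)})
    weights = {w: 0.0 for w in vocab}
    bias = 0.0
    for _ in range(epochs):
        for review, label in zip(reviews, labels):
            score = bias + sum(weights[w] for w in tokenize(review))
            pred = 1 if score > 0 else 0
            if pred != label:  # simple perceptron update on mistakes
                delta = lr * (label - pred)
                bias += delta
                for w in tokenize(review):
                    weights[w] += delta
    return weights, bias

def predict(weights, bias, review):
    score = bias + sum(weights.get(w, 0.0) for w in tokenize(review))
    return "positive" if score > 0 else "negative"

# Made-up training data: 1 = positive review, 0 = negative review
reviews = ["great clean room friendly staff",
           "terrible noisy room rude staff",
           "lovely view great breakfast",
           "dirty bathroom terrible service"]
labels = [1, 0, 1, 0]

weights, bias = train(reviews, labels)
print(predict(weights, bias, "great friendly service"))  # expect: positive
```

A real system, as in the chapters that follow, would replace this linear model with a deep neural network and learned word embeddings, but the end-to-end shape is the same: raw text in, answer out.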
By reading this book, you will learn how to solve many interesting
NLP problems using deep learning. These tasks range from learning
the semantics of words, to generating fresh new stories, to
performing language translation just by looking at bilingual
sentence pairs. So, if you want to be an influencer who changes the
world, studying NLP is critical. All of the technical chapters are
accompanied by exercises, including step-by-step guidance for
readers to implement these systems. For all of the exercises in the
book, we will be using Python with TensorFlow—a popular
distributed computation library that makes implementing deep neural
networks very convenient.
Who this book is for
This book is for aspiring beginners who are seeking to transform the
world by leveraging linguistic data. This book will provide you with a
solid practical foundation for solving NLP tasks. In this book, we will
cover various aspects of NLP, focusing more on the practical
implementation than the theoretical foundation. Having sound
practical knowledge of solving various NLP tasks will help you to
have a smoother transition when learning the more advanced
theoretical aspects of these methods. In addition, a solid practical
understanding will help when performing more domain-specific
tuning of your algorithms, to get the most out of a particular
domain.
What this book covers
Chapter 1, Introduction to Natural Language Processing, starts us
on our journey with a gentle introduction to NLP. In this chapter, we
will first look at the reasons we need NLP. Next, we will discuss some
of the common subtasks found in NLP. Thereafter, we will discuss the
two main eras of NLP—the traditional era and the deep learning era.
We will gain an understanding of the characteristics of the traditional
era by working through how a language modeling task might have
been solved with traditional algorithms. Then, we will discuss the
deep learning era, where deep learning algorithms are heavily
utilized for NLP. We will also discuss the main families of deep
learning algorithms. We will then discuss the fundamentals of one of
the most basic deep learning algorithms—a fully connected neural
network. We will conclude the chapter with a road map that provides
a brief introduction to the coming chapters.
When we reach Tipasa itself the great stones lie in heaps, in most
admired disorder. The ruins in their extent seem to indicate the
existence of a greater town than the historians admit Tipasa to have
been. It is said to have been founded by Claudius as a colony of
veterans, and to have contained 20,000 inhabitants. It is rich in
memories of the great Arian controversy which played so important a
part in the history of North Africa after the triumph of Christianity. In
A.D. 484 the Vandal king, Huneric, imposed an Arian bishop on the
Catholic inhabitants. A great part fled to Spain; those who remained
and refused to accept the heresy had their right arms lopped off and
their tongues cut out. It would seem that different branches of
Christendom have often been inclined to treat their erring brethren
with more severity than they meted out to the unregenerate
heathen. Perhaps the heathen has ever been a more likely convert.
The situation of Tipasa belies the opinion that the ancients had no
eye for natural scenery. It stood on a fair promontory sheltering from
the east a little cove which is protected from the west by the great
mountain mass of Djebel-Chénoua, which lies between Tipasa and
Cherchel. The country around is singularly picturesque, and the tout
ensemble very beautiful, even for this beautiful coast.
Thence we start for a run of fifty or sixty miles by the seaside road
to Algiers, a road which has been splendidly engineered, and is kept
for the most part in a condition beyond praise. In front of us
stretches the coast-line past the Bay of Algiers to Cap Matifou; on
our right are the wooded hills of the Sahel. Here and there the land
between the road and the sea is laid out in gardens formed in small
rectangular plots divided by hedges of a tall reed to break the force
of the wind. Even so the Dutch nurserymen erect screens to protect
their tulips on the wind-swept lowlands of Holland. In these
enclosures we particularly note frequent plantations of the tall
“silver” banana. And so in due time we reach Algiers, conscious of a
well-spent day.
Travel gives the death-blow to many illusions. If there is one tenet
to which British self-complacency has clung with more desperate
energy than another, it is that our people are the only successful
colonists. We are ready to admit that the German has hardly had a
fair chance. He is relegated for the present to desert tropical lands
which failed in the past to tempt even Portugal. That France owns
colonies of a different class we have been dimly aware, but the
oracles of the club and of the Press have consistently pictured to us
the French colonist as a miserable being who passes his time sipping
absinthe in a café, and longing for his return to la belle France.
Possibly in the purlieus of Algiers such a being might be discovered;
at any rate, he is certainly not more in evidence than the “remittance
men” and bar-loafers are in our own colonies. And a motor drive for
twenty or thirty miles through the rich plain which encircles Algiers
will send our long-cherished belief a-packing to the limbo of dead
British prejudices. We have recently discovered that the home-
staying French, at any rate, know something about practical
gardening, and the raising of vegetable crops for market; that their
scientific methods and untiring energy combine to get more out of
the ground than we do; and we have even been led to pocket our
pride and to import certain practical French gardeners, at a fancy
wage, to show us how the thing is done. In this we are only
following the example of our ancestors, who acquired most of their
arts and crafts from French and Flemish refugees. Yet it was quite a
shock when one of these new-comers, looking round him at the fair
fields of the home farm on a great estate in a southern county,
ingenuously remarked, “But why is not this country cultivated?”
Of this great plain between the sea and the mountains no such
question could be asked. Some corn is raised, and some vegetables,
such as artichokes, but most of it is devoted to the culture of the
vine. It is all in the highest state of cultivation, and not an inch is
wasted. The vines are planted in open fields, with the precision of
the hops of Kent. Now is the time of pruning, and they are all being
cut back to within a foot or so of the ground. To an eye accustomed
to the hill-side and rocky vineyards of the Rhine, of Italy, or of
Madeira, to the vines which in Southern Europe throw themselves in
reckless abandon over trellises and wayside trees, these flat fields,
which suggest turnips or beet, have a very unromantic appearance.
But it is easy to see that the cultivation is conducted on the most
scientific and business-like lines.
It was our privilege to be invited to visit a French gentleman and
his family at their residence about twenty miles from Algiers. Our
host has purchased a large tract of land, the whole of which he has
turned into a great vineyard. He has built a pleasant country house,
and filled it with treasures of Arab art, and the trophies of travel in
other lands. He has planted a garden of palms and sub-tropical
shrubs—a garden not kept up to the standard of English trimness,
but rich in shade, and pleasantly suggestive of a jungle. Not only are
his vines planted and pruned with mathematical precision, but all his
machinery for the extraction and treatment of the grape juice is of
the latest and most practical character. A long building lined with
huge vats gives an idea of the greatness of his undertaking, and is
designed to enable him to hold the produce of two vintages in the
event of a bad market:—a very important advantage to a producer.
There is nothing of the model, or pleasure, farm about the place; it
is all intensely practical. “It is an industry,” said our host; and indeed
it is; a fine example of industrial intelligence applied to agriculture.
The presence on the farm of two motor-cars and an aeroplane is
evidence that he is otherwise abreast of the movement.
It may be that our host is exceptionally gifted, both in enterprise
and resources, but at any rate his example must be of great value.
And the vistas all around of similar properties with pleasant houses
bowered in trees and gardens suggest that it is followed. It is
agreeable to learn that this industry meets its due reward. In 1910 it
was exceptionally profitable. The chief buyers of Algerian wines
are the wine-shippers of Bordeaux and Macon, from whose cellars
they emerge as claret and Burgundy. The complete failure of the
vintage in Europe has caused a rise of fully fifty per cent in the price
of the produce of Algeria. In this happy climate, sure of its winter
rain and its summer sun, a failure of the vintage is unknown and
almost inconceivable. Viticulture has become the most important of
the industries in which Europeans in Algeria are engaged, and its
prosperity is of great importance to the Colony. Before the French
conquest, the use of wine being forbidden by the Koran, the vine
was only grown to a small extent for its fruit; the raisin sucré of
Khabylia was especially esteemed as a sweetmeat for dessert. The
first colonists made experiments in the production of wine, but with
insufficient knowledge and inadequate equipment. Wine-makers are
an aristocracy among agriculturists; a high intelligence and inherited
traditions count for much. The ravages of the phylloxera in France
created the opportunity of Algeria. The wine-growers of the South
thrown out of work were ready to emigrate, and the deficit in the
mother country’s production offered a great market for the Colony.
Since that time the industry has made steady progression. In 1850
2000 acres were under cultivation as vineyards; in 1905 about
450,000 acres. The production of wine, which amounted to 370,000
gallons in 1878, is now over 150,000,000 gallons. The price obtained
for wine exported is subject to very wide fluctuations. In 1903 the
100,000,000 gallons exported realized £4,000,000. In 1906
110,000,000 gallons realized only £1,600,000.
Algeria has managed to keep comparatively free from the
phylloxera; the provinces of Oran and Constantine, west and east,
have suffered somewhat, but the central province, Algiers, has so far
escaped. Energetic measures are taken to guard against the
extension of the plague, and owners of vines which it is found
necessary to destroy are compensated by the State. The policy of
the Government is now not to encourage the extension of the
vineyards, but to improve the quality of their produce. An effort
should be made to find other outlets than the French market, and
thus counteract the wide fluctuations in value which arise from its
varying demands. Some attempt has already been made to produce
rich dessert wines similar to those of Portugal and Madeira, of which
there is a considerable consumption in France, and it would appear
that there is no obstacle to its success. A delicious Muscat is already
made, which might conceivably obtain a great vogue.
IV—A GARDEN AND SOME BUILDINGS
The policy to be pursued was the first of them. The expedition had
achieved its punitive object, Algeria appeared to be poor and sterile,
and there was much to be said for abandoning it altogether. At the
other extreme was the proposal to attempt a complete and definite
conquest. A middle course was adopted,—to occupy only certain
important points on the coast and in the interior. It is easy to be wise
after the event; our own colonial experience is full of evidence of the
futility of half-measures; and we need not claim much perspicacity
for observing that France missed the golden opportunity for
occupying the country when the central Government, such as it was,
had been destroyed. But, for all the brave words of the truculent
admiral, she doubtless felt some diffidence in view of her declaration
to Europe, and the continued hostility of Great Britain was not
without its effect. France’s own political position, too, was in a very
disordered condition. On the 18th of August a revolution took place,
Louis Philippe was proclaimed King and Bourmont was recalled.
For the next ten years, from 1830 to 1840, what was known as the
policy of Restricted Occupation was pursued. Certain ports on the
coast were occupied—Oran, Bougie, Bône, etc.—and attempts were
made to bring the plain of the Metidja under French control by
placing garrisons in such towns as Medea and Blidah. The army of
occupation was much reduced, and Clauzel, the general in
command, endeavoured to raise native auxiliary troops, with small
success. He was, at any rate, a master of bombast. Having occupied
Blidah and ascended one of the passes of the Atlas, he addressed his
troops: “Soldats! les feux de nos bivouacs qui, des cimes de l’Atlas,
semblent dans ce moment se confondre avec la lumière des étoiles,
annoncent à l’Afrique la victoire que vous venez de remporter,” etc.
(“Soldiers! the fires of our bivouacs, which from the summits of the
Atlas seem at this moment to mingle with the light of the stars,
announce to Africa the victory you have just won.”)
This pronouncement was followed by the withdrawal of the garrison
and a hasty retreat to Algiers. Early in 1831 Clauzel was recalled. His
successors, Berthezène, the duc de Rovigo and Voirol, essaying a
great undertaking with inadequate means, had no better fortune.
Under Voirol General Desmichels was sent to Oran with the object
of establishing order in the west. The tribes were in arms, and at
their head-quarters at Mascara had chosen as their general a
celebrated marabout, or holy man, named Mahi-ed-Dine, who,
having attacked Oran several times without success, resigned the
command to his son, Abd-el-Kader, then only twenty-four years of
age, but destined to become one of the greatest leaders of modern
times. He was, says Camille Rousset, “of middle height, but well
made, vigorous and untiring. He was the best among the best
horsemen in the world. Physical qualities are highly valued by the
Arabs; Abd-el-Kader had more—the qualities which make men
conquerors: intelligence, sagacity, strength of will, genius to
command. In eloquence he was the equal of the greatest orators,
and could bend crowds to his will. He spoke in serious and measured
tones, and was sparing of gesture, but his pale face was full of
animation, and under their long dark lashes his blue eyes darted
fire.” It may be remarked that the blue eyes point to a Berber, rather
than an Arab origin. Such was the man who for years to come was
to bid defiance to the French.
Their first dealings with him were unfortunate. Desmichels arrived
at Oran in the spring of 1833. Finding that he could make no
headway against Abd-el-Kader, who from his capital of Mascara was
preaching a holy war for the extermination of the infidels, he
concluded with him a treaty which enormously increased the Arab’s
authority. Abd-el-Kader was described in it as Emir; all practical
power was placed in his hands; and he was permitted to purchase
arms and ammunition in French towns. No mention was made of
French sovereignty. The treaty, though contrary to the instructions of
the French Government, was accepted by it in the belief that it
assured peace. Difficulties soon arose. Desmichels was recalled; his
successor, Trezel, at the head of a column of 1700 men, was
attacked by Abd-el-Kader in the marshes of La Macta, and defeated
with the loss of a third of his force.
The prestige of this victory brought many waverers to the Arab
leader’s flag. But France’s disaster brought home to her the
seriousness of the position, and in the end the defeat did more
towards the ultimate conquest than a victory would have done.
Clauzel, who had left Africa almost in disgrace in 1831, was sent
back in full command in 1835. He alone of the French generals had
exhibited any military qualities. His grandiose projects have been
justified by events. His main plan consisted in occupying Mascara
and Tlemçen in the west, Medea and Miliana in the centre, and
Constantine in the east. Of Tlemçen and Constantine he said, “Si
vous n’occupez pas ces deux Gibraltar de la Régence d’Alger, vous
n’en serez jamais les maîtres” (“If you do not occupy these two
Gibraltars of the Regency of Algiers, you will never be its masters”).
His failure was due to his attempt to
effect these objects with the inadequate means with which he was
supplied. He commenced by advancing against Abd-el-Kader, who
retired before him. Having occupied Mascara and Tlemçen, he
returned to Algiers, whereupon Tlemçen was promptly besieged by
the Arabs. At this point the great Frenchman, destined to overthrow
the Arab power and to conquer Algeria, appeared on the scene.
General Bugeaud was sent to command in the west. He was
personally opposed to conquest, and regarded French intervention in
Algeria not only as having been badly conducted, but as initially a
mistake. These views did not prevent him from putting his hand to
the plough. He began by revolutionizing the methods of warfare; in
spite of the opposition of his officers, he dispensed with heavy trains
of baggage and artillery, lightened the loads of the soldiers, and
carried their provisions on mules. Attacking Abd-el-Kader at La
Sikkah he inflicted on him a signal defeat, his native auxiliaries
pursuing the flying enemy with fury and slaughtering them in great
numbers. Bugeaud then returned to France.
Meantime Clauzel, having had some success in the neighbourhood
of Algiers, attacked Constantine, but was ignominiously repulsed,
and was recalled. The city fell the following year to General Valée. In
1837 Bugeaud was sent back to Oran, with instructions to make
terms with Abd-el-Kader on the basis of surrendering to him the
province of Oran in consideration of his recognizing the sovereignty
of France and paying tribute. The two leaders met and negotiated
the treaty of the Tafna. It was all in the Arab’s favour; the tribute
fixed was nominal, the sovereignty question ignored. In native eyes
Abd-el-Kader became a veritable monarch, his territory was assured
to him and he had leisure to gather his forces for a further struggle.
We must suppose either that Bugeaud’s private preferences carried
him away, or that the situation in the west was too desperate to
warrant his insisting on better terms. For two years peace reigned,
but in 1839 Abd-el-Kader proclaimed a holy war. Arabs and Khabyles
invaded the Metidja and burnt the farms of the French colonists.
Hostilities lasted for two years with no decisive result. In October,
1840, the Governor-General, Valée, was recalled, and Bugeaud was
sent out in supreme command to inaugurate a new policy.
EVENING PRAYER