Practical Work 03: Advanced Functions in Python

Ex 01: Nested Function with State Tracking

Create a function counter(start=0) that returns a nested function. The nested function, when called, should increment the count starting from start and return the updated count. Each instance of counter should have its own independent state.

Instructions:
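
A minimal sketch of one possible approach, using a closure with nonlocal so that each counter keeps its own state (the inner name increment is an arbitrary choice):

    def counter(start=0):
        count = start

        def increment():
            nonlocal count   # rebind the variable captured from counter()
            count += 1
            return count

        return increment

    c1 = counter()       # each counter has independent state
    c2 = counter(10)
    print(c1(), c1())    # 1 2
    print(c2())          # 11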

… specific type. The decorator should take the expected type as a parameter.

Instructions:

1. Use *args and **kwargs to handle arbitrary arguments.
2. If an argument doesn't match the expected type, raise a TypeError.
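
The beginning of this statement is truncated, so the sketch below assumes the exercise asks for a decorator that rejects any argument not of one expected type; the names enforce_type and add are assumptions:

    import functools

    def enforce_type(expected_type):
        """Hypothetical decorator factory: every argument must be of expected_type."""
        def decorator(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                for value in list(args) + list(kwargs.values()):
                    if not isinstance(value, expected_type):
                        raise TypeError(
                            f"{value!r} is not of type {expected_type.__name__}"
                        )
                return func(*args, **kwargs)
            return wrapper
        return decorator

    @enforce_type(int)
    def add(a, b):
        return a + b

    print(add(2, 3))    # 5
    # add(2, "3")       # raises TypeError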

Ex 06: Function Timer Decorator

Write a decorator time_it that measures and prints the execution time of a function. Test it with a function that sums all numbers in a large list.

Instructions:

1. Use time or timeit to measure the time.
2. The decorator should print how long the function took to execute each time it's called.
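
A minimal sketch using time.perf_counter() (timeit would work as well); the test function sum_large_list and the list size are arbitrary choices:

    import functools
    import time

    def time_it(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = func(*args, **kwargs)
            elapsed = time.perf_counter() - start
            print(f"{func.__name__} took {elapsed:.6f} s")
            return result
        return wrapper

    @time_it
    def sum_large_list():
        return sum(range(10_000_000))

    sum_large_list()    # prints the elapsed time for the call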

Ex 07: Currying with Multiple Arguments

Problem: Create a curried function multiply that takes three arguments and returns their product. The function should allow partial application.

Instructions:

1. Use nested lambdas to implement currying.
2. Each function call should only take one argument at a time.
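
One possible sketch with nested lambdas:

    # Each call consumes exactly one argument
    multiply = lambda x: lambda y: lambda z: x * y * z

    print(multiply(2)(3)(4))     # 24

    # Partial application: fix the first two factors, reuse the rest
    times_six = multiply(2)(3)
    print(times_six(5))          # 30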

Ex 08: Recursion with Depth Tracking

Write a recursive function deep_sum that calculates the sum of all elements in a nested list. The function should also return the maximum depth of nesting encountered.

Instructions:

1. Use recursion to traverse nested lists.
2. Track the current depth of recursion and update the maximum depth.
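
A sketch of one possible recursion; the depth parameter and the convention that the outermost list has depth 1 are assumptions:

    def deep_sum(nested, depth=1):
        """Return (total, max_depth) for a possibly nested list of numbers."""
        total, max_depth = 0, depth
        for item in nested:
            if isinstance(item, list):
                sub_total, sub_depth = deep_sum(item, depth + 1)
                total += sub_total
                max_depth = max(max_depth, sub_depth)
            else:
                total += item
        return total, max_depth

    print(deep_sum([1, [2, [3, 4]], 5]))   # (15, 3)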

Ex 09: Using filter, map, and reduce with Lambda

You are given a list of numbers. Perform the following operations using lambda functions:

1. Filter the list to include only even numbers.
2. Map the list to get the square of each number.
3. Reduce the list to calculate the sum of the squared even numbers.

Instructions:

1. Use filter() to filter even numbers.
2. Use map() to square the numbers.
3. Use reduce() from functools to compute the sum.
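
A possible chain of the three calls; the example list numbers is assumed:

    from functools import reduce

    numbers = [1, 2, 3, 4, 5, 6]                        # example input (assumed)

    evens = filter(lambda n: n % 2 == 0, numbers)       # 1. keep even numbers
    squares = map(lambda n: n ** 2, evens)              # 2. square each one
    total = reduce(lambda acc, n: acc + n, squares, 0)  # 3. sum them up

    print(total)   # 4 + 16 + 36 = 56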

Ex 10: Asynchronous Web Scraper Simulation

Create an asynchronous program that simulates scraping data from multiple websites using asyncio. Each simulated website will have a random response time to mimic real-world conditions.

1. Create an asynchronous function fetch_data that simulates fetching data from a website. The function should take a website name (string) as an argument, simulate a delay (using await asyncio.sleep), and return the website name and the simulated response time.
2. Write an asynchronous function fetch_all_data that takes a list of website names and uses asyncio.gather() to fetch data from all websites concurrently.
3. Update the fetch_data function (or create a new one, fetch_data_with_timeout) to include a timeout. If fetching data takes longer than 3 seconds, it should raise an asyncio.TimeoutError. Use asyncio.wait_for() to implement this.
4. Use the main function to run the program and simulate fetching data from several websites using asyncio.run():

    import asyncio

    async def main():
        websites = ["site1.com", "site2.com", "site3.com", "site4.com"]
        await fetch_all_data(websites)

    # Run the main function
    asyncio.run(main())
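
A sketch of one possible solution; the random delays of 1 to 4 seconds are assumed, and timeouts are collected with return_exceptions=True so that one slow site does not cancel the others:

    import asyncio
    import random

    async def fetch_data(website):
        """Simulate fetching one website with a random response time."""
        delay = random.uniform(1, 4)
        await asyncio.sleep(delay)
        return website, delay

    async def fetch_data_with_timeout(website, timeout=3):
        """Like fetch_data, but raise asyncio.TimeoutError after `timeout` seconds."""
        return await asyncio.wait_for(fetch_data(website), timeout=timeout)

    async def fetch_all_data(websites):
        """Fetch every website concurrently and report each result."""
        results = await asyncio.gather(
            *(fetch_data_with_timeout(site) for site in websites),
            return_exceptions=True,   # keep the other tasks even if one times out
        )
        for site, result in zip(websites, results):
            if isinstance(result, asyncio.TimeoutError):
                print(f"{site}: timed out after 3 s")
            else:
                name, delay = result
                print(f"{name}: responded in {delay:.2f} s")

    async def main():
        websites = ["site1.com", "site2.com", "site3.com", "site4.com"]
        await fetch_all_data(websites)

    asyncio.run(main())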