
Lambda, Map, and Filter in Python


Lambda: Lambda functions are known as anonymous functions in Python. These functions are declared without a name.

  • Can accept any number of arguments. 
  • Restricted to only a single expression. 
  • Typically used only once.
  • A lambda function is used when you need a function for a short period of time. 
Basic syntax: 
lambda arguments: expression

Example:
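A minimal sketch (the names here are only illustrative): a lambda that doubles its argument, bound to a name so it can be called like an ordinary function.

double = lambda x: x * 2   # anonymous function; assigned to a name just for demonstration
print(double(5))           # prints 10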

Map: Map is a built-in function that takes a function object and one or more iterables (such as a list or dictionary) as its parameters, and applies the function to every element of the iterable(s).
Basic syntax:
map(function_object, iterable_1, iterable_2, ...)

A few examples of map in Python:
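Two short sketches (variable names are illustrative): squaring every element of a list, and summing two lists element-wise by passing two iterables. Note that in Python 3 map returns an iterator, so list() is used to materialise the result.

numbers = [1, 2, 3, 4]
squares = list(map(lambda x: x ** 2, numbers))   # apply the lambda to every element
print(squares)                                   # [1, 4, 9, 16]

a = [1, 2, 3]
b = [10, 20, 30]
sums = list(map(lambda x, y: x + y, a, b))       # the lambda takes one item from each iterable
print(sums)                                      # [11, 22, 33]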

Filter: It works in a similar way to map. The difference is that it builds a new sequence from the elements of the iterable that satisfy some condition. In terms of parameters, it takes a function and one iterable.
Basic syntax:
filter(function, iterable)

It is normally used with lambda functions. An example is given below:
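A short sketch (names are illustrative): keeping only the even numbers from a list. As with map, filter returns an iterator in Python 3, so list() is used to collect the result.

numbers = [1, 2, 3, 4, 5, 6]
evens = list(filter(lambda x: x % 2 == 0, numbers))   # keep elements for which the lambda returns True
print(evens)                                          # [2, 4, 6]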


