
Data Flow with Python Cheat Sheet by datamansam


name_of_new_list =
[expression for item in iterable
 if condition]
squares =
[number**2 for number in numbers
 if number < 5]
For a generator expression, use ()
instead of []
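A minimal runnable sketch of the syntax above (the list names are illustrative): the list comprehension builds the whole list at once, while the same expression in () gives a lazy generator.

```python
numbers = [1, 2, 3, 4, 5, 6]

# List comprehension: builds the whole list in memory
squares = [number**2 for number in numbers if number < 5]
print(squares)  # [1, 4, 9, 16]

# Generator expression: same syntax with (), evaluated lazily
squares_gen = (number**2 for number in numbers if number < 5)
print(list(squares_gen))  # [1, 4, 9, 16]
```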


for x, y in art_galleries.items():

# .items() returns (key, value) pairs:
# x holds the keys, y the values
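A short runnable example of iterating with .items() (the art_galleries dictionary here is made up for illustration):

```python
art_galleries = {'Uptown Gallery': '10001', 'Downtown Gallery': '10013'}

# .items() yields each (key, value) pair in turn
for name, zip_code in art_galleries.items():
    print(name, zip_code)
```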

Set Functions to process Iterable Objects

Create a set from a list:
cookies_eaten_today = ['chocolate chip', 'peanut butter', 'chocolate chip', 'oatmeal cream', 'chocolate chip']
types_of_cookies_eaten = set(cookies_eaten_today)
Adding elements to a set:
.add() adds a single element
types_of_cookies_eaten.add('biscotti')
.update() merges in another set or list
types_of_cookies_eaten.update(cookies_we_will_eat)
.discard() safely removes an element from the set by value
types_of_cookies_eaten.discard('biscotti')
Combining Sets:
.union() returns a set of all the unique values ( | )
cookies_jason_ate.union(cookies_hugo_ate)
.intersection() identifies overlapping data ( & )
cookies_jason_ate.intersection(cookies_hugo_ate)
.difference() identifies data present in the set on which the method was called but not in the argument ( - )
cookies_jason_ate.difference(cookies_hugo_ate)
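Putting the combining methods above together in one runnable sketch (the cookie sets are illustrative):

```python
cookies_jason_ate = {'chocolate chip', 'oatmeal cream', 'biscotti'}
cookies_hugo_ate = {'chocolate chip', 'anzac'}

# All unique cookies either of them ate
print(cookies_jason_ate.union(cookies_hugo_ate))

# Cookies both of them ate
print(cookies_jason_ate.intersection(cookies_hugo_ate))  # {'chocolate chip'}

# Cookies only Jason ate (set order in the printout may vary)
print(cookies_jason_ate.difference(cookies_hugo_ate))
```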

Lambda Functions

FunctionName =
lambda arguments : expression
raise_to =
lambda param1, param2: param1 ** param2
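The syntax above in action (the names raise_to and words are chosen for illustration):

```python
# Named lambda: equivalent to def raise_to(param1, param2): return param1 ** param2
raise_to = lambda param1, param2: param1 ** param2
print(raise_to(2, 10))  # 1024

# Lambdas are handy as throwaway key functions
words = ['kiwi', 'fig', 'banana']
print(sorted(words, key=lambda w: len(w)))  # ['fig', 'kiwi', 'banana']
```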

Using a Lambda Function inside another Function

# a function factory: myfunc(n) returns a
# lambda that multiplies its argument by n
def myfunc(n):
  return lambda a : a * n

mydoubler = myfunc(2)
print(mydoubler(11))  # 22


Lambda with Map

# Create a list of strings: spells
spells = ["protego", "accio", "expecto patronum", "legilimens"]

# Use map() to apply a lambda function over spells: shout_spells
shout_spells = map(lambda item: item + '!!!' , spells)

# Convert shout_spells to a list: shout_spells_list
shout_spells_list = list(shout_spells)

# Print the result
print(shout_spells_list)


# Import reduce from functools
from functools import reduce

# Create a list of strings: stark
stark = ['robb', 'sansa', 'arya', 'brandon', 'rickon']

# Use reduce() to apply a lambda function over stark: result
result = reduce(lambda item1, item2: item1 + item2, stark)

# Print the result
print(result)


nums = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print("Original list of integers:")
print(nums)
print("\nItems whose result is less than 3 when floor-divided by 2:")
less_than_3 = list(filter(lambda x: x // 2 < 3, nums))
print(less_than_3)  # [1, 2, 3, 4, 5]

Iterating through DataFrame Columns

# Extract column from DataFrame: col
col = df[col_name]
# Iterate over each entry in the column
for entry in col:
    print(entry)
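A runnable version of the fragment above, assuming pandas is installed and using a small made-up DataFrame:

```python
import pandas as pd

df = pd.DataFrame({'lang': ['en', 'et', 'en']})
col_name = 'lang'

# Extract column from DataFrame: col
col = df[col_name]

# Iterate over each entry in the column
for entry in col:
    print(entry)
```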

Iterating through DataFrames

# Define count_entries()
def count_entries(df, col_name='lang'):
    """Return a dictionary with counts of
    occurrences as value for each key."""

    # Initialize an empty dictionary: cols_count
    cols_count = {}

    # Add try block
    try:
        # Extract column from DataFrame: col
        col = df[col_name]
        # Iterate over each entry in the column
        for entry in col:
            # If entry is in cols_count, add 1
            if entry in cols_count.keys():
                cols_count[entry] += 1
            # Else add the entry to cols_count, set the value to 1
            else:
                cols_count[entry] = 1
        # Return the cols_count dictionary
        return cols_count

    # Add except block
    except KeyError:
        print('The DataFrame does not have a ' + col_name + ' column.')

# Call count_entries(): result1
result1 = count_entries(tweets_df, 'lang')

# Print result1
print(result1)
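As a side note, the counting loop above can be sketched more compactly with collections.Counter (tweets_df is the exercise's DataFrame; the toy frame below is made up):

```python
from collections import Counter
import pandas as pd

tweets_df = pd.DataFrame({'lang': ['en', 'en', 'et', 'und', 'en']})

# Counter does the if/else bookkeeping for us
result1 = dict(Counter(tweets_df['lang']))
print(result1)  # {'en': 3, 'et': 1, 'und': 1}
```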

apply, applymap and map

.apply(): applies a function along an axis of a
DataFrame (row- or column-wise); works on
DataFrames and Series; applied to both series
and elements.
.applymap(): element-wise operation across one
or more rows and columns of a DataFrame; only
DataFrames; applied to elements individually.
.map(): substitutes each Series value from a
lookup dictionary, Series or a function; used
only for a Series object; applied to series
elements.

Code Examples of apply, applymap and map

df.apply(np.sum, axis=0)
-> col sums

df.apply(np.sum, axis=1)
-> row sums

df.applymap(lambda x: x**2)
-> Every df element squared

s = pd.Series(['cat', 'dog', np.nan, 'rabbit'])
s.map({'cat': 'kitten', 'dog': 'puppy'})
-> values without a lookup entry become NaN
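The snippets above, made runnable on a small made-up DataFrame. Note that applymap was renamed to DataFrame.map in newer pandas versions, so the sketch below guards for that:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({'a': [1, 2], 'b': [3, 4]})

print(df.apply(np.sum, axis=0))  # column sums: a -> 3, b -> 7
print(df.apply(np.sum, axis=1))  # row sums: 4, 6

# Element-wise: applymap on older pandas, DataFrame.map on newer
squared = df.applymap(lambda x: x**2) if hasattr(df, 'applymap') else df.map(lambda x: x**2)
print(squared)

# Series.map with a lookup dict; missing keys become NaN
s = pd.Series(['cat', 'dog', np.nan, 'rabbit'])
print(s.map({'cat': 'kitten', 'dog': 'puppy'}))
```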


