OK, we've now added six records to our database table. Let's run an SQL query to see how we did:

>>> curs.execute('select * from people')
>>> curs.fetchall()
[('bob', 'dev', 5000), ('sue', 'mus', 70000), ('ann', 'mus', 60000), ('tom', 'mgr', 100000), ('kim', 'adm', 30000), ('pat', 'dev', 90000)]

Run an SQL select statement with the cursor object to grab all rows, and call the cursor's fetchall to retrieve them. They come back to our script as a sequence of sequences. In this module, it's a list of tuples--the outer list represents the result table, the nested tuples are that table's rows, and the nested tuples' contents are the column data. Because it's all Python data, once we get the query result, we process it with normal Python code. For example, to make the display a bit more coherent, loop through the query's result as usual:

>>> curs.execute('select * from people')
>>> for row in curs.fetchall():
...     print(row)
...
('bob', 'dev', 5000)
('sue', 'mus', 70000)
('ann', 'mus', 60000)
('tom', 'mgr', 100000)
('kim', 'adm', 30000)
('pat', 'dev', 90000)

Tuple unpacking comes in handy in loops here, too, to pick out column values as we go. Here's a simple formatted display of two of the columns' values:

>>> curs.execute('select * from people')
>>> for (name, job, pay) in curs.fetchall():
...     print(name, ':', pay)
...
bob : 5000
sue : 70000
ann : 60000
tom : 100000
kim : 30000
pat : 90000

Because the query result is a sequence, we can use Python's powerful sequence and iteration tools to process it. For instance, to select just the name column values, we can run a more specific SQL query and get a list of tuples:

>>> curs.execute('select name from people')
>>> names = curs.fetchall()
>>> names
[('bob',), ('sue',), ('ann',), ('tom',), ('kim',), ('pat',)]

Or we can use a Python list comprehension to pick out the fields we want--by using Python code, we have more control over the data's content and format:
>>> names = [rec[0] for rec in curs.fetchall()]
>>> names
['bob', 'sue', 'ann', 'tom', 'kim', 'pat']

The fetchall call we've used so far fetches the entire query result table all at once, as a single sequence (an empty sequence comes back if the result is empty). That's convenient, but it may be slow enough to block the caller temporarily for large result tables, or generate substantial network traffic if the server is running remotely (something that could easily require a parallel thread in a GUI). To avoid such a bottleneck, we can also grab just one row, or a bunch of rows, at a time with fetchone and fetchmany. The fetchone call returns the next result row, or a None false value at the end of the table:

>>> curs.execute('select * from people')
>>> while True:
...     row = curs.fetchone()
...     if not row: break
...     print(row)
...
('bob', 'dev', 5000)
('sue', 'mus', 70000)
('ann', 'mus', 60000)
('tom', 'mgr', 100000)
('kim', 'adm', 30000)
('pat', 'dev', 90000)

The fetchmany call returns a sequence of rows from the result, but not the entire table; you can specify how many rows to grab each time with a parameter, or rely on the default as given by the cursor's arraysize attribute. Each call gets at most that many more rows from the result, or an empty sequence at the end of the table:

>>> curs.execute('select * from people')
>>> while True:
...     rows = curs.fetchmany()            # size=N optional argument
...     if not rows: break
...     for row in rows:
...         print(row)
...
('bob', 'dev', 5000)
('sue', 'mus', 70000)
('ann', 'mus', 60000)
('tom', 'mgr', 100000)
('kim', 'adm', 30000)
('pat', 'dev', 90000)

For this module at least, the result table is exhausted after a fetchone or fetchmany returns a false value. The DB API says that fetchall returns "all (remaining) rows," so you generally need to call execute again to regenerate results before fetching new data:

>>> curs.fetchone()
>>> curs.fetchmany()
[]
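Besides fetchone and fetchmany, the sqlite3 module's cursors also support direct iteration, yielding one row per step much like repeated fetchone calls. This is a sqlite3 extension rather than a portable DB API guarantee, so don't rely on it with other interface modules. A minimal self-contained sketch (the in-memory table and its sample records here are illustrative stand-ins for the tutorial's dbase1 file):

```python
import sqlite3

# Build a small throwaway table so the fetch loops above can be mimicked
# without the tutorial's dbase1 file; the records are illustrative.
conn = sqlite3.connect(':memory:')
curs = conn.cursor()
curs.execute('create table people (name char(30), job char(10))')
curs.executemany('insert into people values (?, ?)',
                 [('bob', 'dev'), ('sue', 'mus')])

# sqlite3 cursors are themselves iterable, yielding one row per step,
# much like repeated fetchone calls -- handy, but an extension beyond
# the portable DB API, so other database modules may not support it.
curs.execute('select * from people')
rows = [row for row in curs]
print(rows)
```

Like fetchone loops, iterating this way avoids materializing the whole result table at once.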
Naturally, we can do more than fetch an entire table; the full power of the SQL language is at your disposal in Python:

>>> curs.execute('select name, job from people where pay > 60000')
>>> curs.fetchall()
[('sue', 'mus'), ('tom', 'mgr'), ('pat', 'dev')]

The last query fetches name and job fields for people who earn more than $60,000. The next is similar, but passes in the selection value as a parameter and orders the result table:

>>> query = 'select name, job from people where pay >= ? order by name'
>>> curs.execute(query, [60000])
>>> for row in curs.fetchall(): print(row)
...
('ann', 'mus')
('pat', 'dev')
('sue', 'mus')
('tom', 'mgr')

Running Updates

Cursor objects also are used to submit SQL update statements to the database server--updates, deletes, and inserts. We've already seen the insert statement at work. Let's start a new session to perform some other kinds of updates; we begin with the same data we had in the prior session:

C:\...\PP4E\Dbase\Sql> python
>>> import sqlite3
>>> conn = sqlite3.connect('dbase1')
>>> curs = conn.cursor()
>>> curs.execute('select * from people')
>>> curs.fetchall()
[('bob', 'dev', 5000), ('sue', 'mus', 70000), ('ann', 'mus', 60000), ('tom', 'mgr', 100000), ('kim', 'adm', 30000), ('pat', 'dev', 90000)]

The SQL update statement changes records--the following changes three records' pay column values (bob, ann, and kim), because their pay was no more than $60,000. As usual, the cursor's rowcount gives the number of records changed:

>>> curs.execute('update people set pay=? where pay <= ?', [65000, 60000])
>>> curs.rowcount
3
>>> curs.execute('select * from people')
>>> curs.fetchall()
[('bob', 'dev', 65000), ('sue', 'mus', 70000), ('ann', 'mus', 65000), ('tom', 'mgr', 100000), ('kim', 'adm', 65000), ('pat', 'dev', 90000)]

The SQL delete statement removes records, optionally according to a condition (to delete all records, omit the condition). In the following, we delete bob's record, as well as any record with a pay that is at least $90,000:
>>> curs.execute('delete from people where name = ?', ['bob'])
>>> curs.execute('delete from people where pay >= ?', (90000,))
>>> curs.execute('select * from people')
>>> curs.fetchall()
[('sue', 'mus', 70000), ('ann', 'mus', 65000), ('kim', 'adm', 65000)]
>>> conn.commit()

Finally, remember to commit your changes to the database before exiting Python, assuming you wish to keep them. Without a commit, a connection rollback or close call, as well as the connection's __del__ deletion method, will back out uncommitted changes. Connection objects are automatically closed if they are still open when they are garbage collected, which in turn triggers a __del__ and a rollback; garbage collection happens automatically on program exit, if not sooner.

Building Record Dictionaries

Now that we've seen the basics in action, let's move on and apply them to a few larger tasks. The SQL API defines query results to be sequences of sequences. One of the more common features that people seem to miss from the API is the ability to get records back as something more structured--a dictionary or class instance, for example, with keys or attributes giving column names. The ORMs we'll meet at the end of this chapter map to class instances, but because this is Python, it's easy to code this kind of transformation in other ways. Moreover, the API already gives us the tools we need.

Using table descriptions

For example, after a query execute call, the DB API specifies that the cursor's description attribute gives the names and (for some databases) types of the columns in the result table. To see how, let's continue with the database in the state in which we left it in the prior section:

>>> curs.execute('select * from people')
>>> curs.description
(('name', None, None, None, None, None, None), ('job', None, None, None, None, None, None), ('pay', None, None, None, None, None, None))
>>> curs.fetchall()
[('sue', 'mus', 70000), ('ann', 'mus', 65000), ('kim', 'adm', 65000)]

Formally, the description is a sequence of column-description sequences, each of the following form. See the DB API for more on the meaning of the type code slot--it maps to objects at the top level of the database interface module, but the sqlite3 module implements only the name
component:

    (name, type_code, display_size, internal_size, precision, scale, null_ok)

Now, we can use this metadata anytime we want to label the columns--for instance, in a formatted records display (be sure to regenerate a query result first, since the prior result has been fetched):
>>> colnames = [desc[0] for desc in curs.description]
>>> colnames
['name', 'job', 'pay']
>>> for row in curs.fetchall():
...     for name, value in zip(colnames, row):
...         print(name, '\t=>', value)
...     print()
...
name    => sue
job     => mus
pay     => 70000

name    => ann
job     => mus
pay     => 65000

name    => kim
job     => adm
pay     => 65000

Notice how a tab character is used to try to make this output align; a better approach might be to determine the maximum field-name length (we'll see how in a later example).

Record dictionaries construction

It's a minor extension of our formatted display code to create a dictionary for each record, with field names for keys--we just need to fill in the dictionary as we go:

>>> curs.execute('select * from people')
>>> colnames = [desc[0] for desc in curs.description]
>>> rowdicts = []
>>> for row in curs.fetchall():
...     newdict = {}
...     for name, val in zip(colnames, row):
...         newdict[name] = val
...     rowdicts.append(newdict)
...
>>> for row in rowdicts: print(row)
...
{'pay': 70000, 'job': 'mus', 'name': 'sue'}
{'pay': 65000, 'job': 'mus', 'name': 'ann'}
{'pay': 65000, 'job': 'adm', 'name': 'kim'}

Because this is Python, though, there are more powerful ways to build up these record dictionaries. For instance, the dictionary constructor call accepts the zipped name/value pairs to fill out the dictionaries for us:

>>> curs.execute('select * from people')
>>> colnames = [desc[0] for desc in curs.description]
>>> rowdicts = []
>>> for row in curs.fetchall():
...     rowdicts.append(dict(zip(colnames, row)))
...
>>> rowdicts[0]
{'pay': 70000, 'job': 'mus', 'name': 'sue'}

And finally, a list comprehension will do the job of collecting the dictionaries into a list--not only is this less to type, but it probably runs quicker than the original version:

>>> curs.execute('select * from people')
>>> colnames = [desc[0] for desc in curs.description]
>>> rowdicts = [dict(zip(colnames, row)) for row in curs.fetchall()]
>>> rowdicts[0]
{'pay': 70000, 'job': 'mus', 'name': 'sue'}

One of the things we lose when moving to dictionaries is record field order--if you look back at the raw result of fetchall, you'll notice that record fields are in the name, job, and pay order in which they were stored. Our dictionaries' fields come back in the pseudorandom order of Python mappings. As long as we fetch fields by key, this is irrelevant to our script. Tables still maintain their order, and dictionary construction works fine because the description result tuple is in the same order as the fields in the row tuples returned by queries.

We'll leave the task of translating record tuples into class instances as a suggested exercise, except for two hints: Python's standard library collections module implements more exotic data types, such as named tuples and ordered dictionaries; and we can access fields as attributes rather than as keys, by simply creating an empty class instance and assigning to attributes with the Python setattr function. Classes would also provide a natural place to code inheritable tools, such as standard display methods. In fact, this is part of the utility that the upcoming ORMs can provide for us.

Automating with Scripts and Modules

Up to this point, we've essentially used Python as a command-line SQL client--our queries have been typed and run interactively. All the kinds of code we've run, though, can be used as the basis of database access in script files. Working interactively requires retyping things such as multiline loops, which can become tedious. With scripts, we can automate our work.

To demonstrate, let's make the last section's example into a utility module--the following is a reusable module that
knows how to translate the result of a query from row tuples to row dictionaries.

Example: PP4E\Dbase\Sql\makedicts.py

"""
convert list of row tuples to list of row dicts with field name keys;
this is not a command-line utility: hardcoded self-test if run
"""

def makedicts(cursor, query, params=()):
    cursor.execute(query, params)
    colnames = [desc[0] for desc in cursor.description]
    rowdicts = [dict(zip(colnames, row)) for row in cursor.fetchall()]
    return rowdicts

if __name__ == '__main__':    # self test
    import sqlite3
    conn = sqlite3.connect('dbase1')
    cursor = conn.cursor()
    query  = 'select name, pay from people where pay < ?'
    lowpay = makedicts(cursor, query, [70000])
    for rec in lowpay: print(rec)

As usual, we can run this file from the system command line as a script, to invoke its self-test code:

C:\...\PP4E\Dbase\Sql> makedicts.py
{'pay': 65000, 'name': 'ann'}
{'pay': 65000, 'name': 'kim'}

Or we can import it as a module and call its function from another context, like the interactive prompt. Because it is a module, it has become a reusable database tool:

C:\...\PP4E\Dbase\Sql> python
>>> from makedicts import makedicts
>>> from sqlite3 import connect
>>> conn = connect('dbase1')
>>> curs = conn.cursor()
>>> curs.execute('select * from people')
>>> curs.fetchall()
[('sue', 'mus', 70000), ('ann', 'mus', 65000), ('kim', 'adm', 65000)]

>>> rows = makedicts(curs, "select name from people where job = 'mus'")
>>> rows
[{'name': 'sue'}, {'name': 'ann'}]

Our utility handles arbitrarily complex queries--they are simply passed through the DB API to the database server or library. The order by clause here sorts the result on the name field:

>>> query = 'select name, pay from people where job = ? order by name'
>>> musicians = makedicts(curs, query, ['mus'])
>>> for row in musicians: print(row)
...
{'pay': 65000, 'name': 'ann'}
{'pay': 70000, 'name': 'sue'}

Tying the Pieces Together

So far, we've learned how to make databases and tables, insert records into tables, query table contents, and extract column names for reference. To show how these techniques are combined, the following example collects them into a single script:
Example: PP4E\Dbase\Sql\testdb.py

from sqlite3 import connect
conn = connect('dbase1')
curs = conn.cursor()
try:
    curs.execute('drop table people')
except:
    pass                                # did not exist
curs.execute('create table people (name char(30), job char(10), pay int(4))')
curs.execute('insert into people values (?, ?, ?)', ('bob', 'dev', 50000))
curs.execute('insert into people values (?, ?, ?)', ('sue', 'dev', 60000))

curs.execute('select * from people')
for row in curs.fetchall():
    print(row)

curs.execute('select * from people')
colnames = [desc[0] for desc in curs.description]
while True:
    print('-' * 30)
    row = curs.fetchone()
    if not row: break
    for (name, value) in zip(colnames, row):
        print('%s => %s' % (name, value))

conn.commit()                           # save inserted records

Refer to prior sections in this tutorial if any of the code in this script is unclear. When run, it creates a two-record database and lists its content to the standard output stream:

C:\...\PP4E\Dbase\Sql> testdb.py
('bob', 'dev', 50000)
('sue', 'dev', 60000)
------------------------------
name => bob
job => dev
pay => 50000
------------------------------
name => sue
job => dev
pay => 60000

As is, this example is really just meant to demonstrate the database API. It hardcodes database names, and it re-creates the database from scratch each time. We could turn this code into generally useful tools by refactoring it into reusable parts, as we'll see later in this section. First, though, let's explore techniques for getting data into our databases.
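As a side note, the script's back-to-back insert statements could also be issued with a single executemany call, which runs one parameterized statement once per record sequence. A minimal self-contained sketch, using a throwaway in-memory database instead of the script's dbase1 file:

```python
import sqlite3

# An alternative to the script's back-to-back insert statements:
# executemany runs one parameterized insert per record in the sequence.
# A throwaway in-memory database stands in for the script's dbase1 file.
conn = sqlite3.connect(':memory:')
curs = conn.cursor()
curs.execute('create table people (name char(30), job char(10), pay int(4))')

rows = [('bob', 'dev', 50000), ('sue', 'dev', 60000)]
curs.executemany('insert into people values (?, ?, ?)', rows)
conn.commit()

curs.execute('select * from people')
print(curs.fetchall())
```

This buys brevity when the records to insert are already collected in a list, as they will be in the file-loading scripts ahead.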
One of the nice things about using Python in the database domain is that you can combine the power of the SQL query language with the power of the Python general-purpose programming language. They naturally complement each other.

Loading with SQL and Python

Suppose, for example, that you want to load a database table from a flat file, where each line in the file represents a database row, with individual field values separated by commas. The following two datafiles are the ones we're going to be using here.

Example: PP4E\Dbase\Sql\data.txt

bob,devel,50000
sue,music,60000
ann,devel,40000
tim,admin,30000
kim,devel,60000

Example: PP4E\Dbase\Sql\data2.txt

bob,developer,80000
sue,music,90000
ann,manager,80000

Now, some database systems like MySQL have a handy SQL statement for loading such a table quickly. Its load data statement parses and loads data from a text file, located on either the client or the server machine. In the following, the first command deletes all records in the table, and we're using the fact that Python automatically concatenates adjacent string literals to split the SQL statement over multiple lines:

>>> # using MySQL (whose interface was 2.X-only at this writing); log into MySQL first
>>> curs.execute('delete from people')                  # all records
>>> curs.execute(
...     "load data local infile 'data.txt' "
...     "into table people fields terminated by ','")
>>> curs.execute('select * from people')
>>> for row in curs.fetchall(): print(row)
...
('bob', 'devel', 50000L)                                # 2.X long integers
('sue', 'music', 60000L)
('ann', 'devel', 40000L)
('tim', 'admin', 30000L)
('kim', 'devel', 60000L)
>>> conn.commit()

This works as expected. But what if you must use a system like the SQLite database used in this book, which lacks this specific SQL statement? Or, perhaps you just need to do something more custom than this MySQL statement allows. Not to worry--
we can achieve the same result with portable SQL and Python (again, some irrelevant output lines are omitted here):

C:\...\PP4E\Dbase\Sql> python
>>> from sqlite3 import connect
>>> conn = connect('dbase1')
>>> curs = conn.cursor()
>>> curs.execute('delete from people')          # empty the table
>>> curs.execute('select * from people')
>>> curs.fetchall()
[]

>>> file = open('data.txt')
>>> rows = [line.rstrip().split(',') for line in file]
>>> rows[0]
['bob', 'devel', '50000']

>>> for rec in rows:
...     curs.execute('insert into people values (?, ?, ?)', rec)
...
>>> curs.execute('select * from people')
>>> for rec in curs.fetchall(): print(rec)
...
('bob', 'devel', 50000)
('sue', 'music', 60000)
('ann', 'devel', 40000)
('tim', 'admin', 30000)
('kim', 'devel', 60000)

This code makes use of a list comprehension to collect string split results for all lines in the file after removing any newline characters, and file iterators to step through the file line by line. Its Python loop does the same work as the MySQL load statement, and it will work on more database types, including SQLite. We could get a similar result from an executemany DB API call shown earlier as well, but the Python for loop here has the potential to be more general.

Python versus SQL

In fact, you have the entire Python language at your disposal for processing database results, and a little Python can often duplicate or go beyond SQL syntax. For instance, SQL has special aggregate function syntax for computing things such as sums and averages:

>>> curs.execute("select sum(pay), avg(pay) from people where job = 'devel'")
>>> curs.fetchall()
[(150000, 50000.0)]

By shifting the processing to Python, we can sometimes simplify and do more than SQL's syntax allows (albeit potentially sacrificing any query performance optimizations the database may perform). Computing pay sums and averages with Python can be accomplished with a simple loop:
>>> curs.execute("select name, pay from people where job = 'devel'")
>>> result = curs.fetchall()
>>> result
[('bob', 50000), ('ann', 40000), ('kim', 60000)]

>>> tot = 0
>>> for (name, pay) in result: tot += pay
...
>>> print('total:', tot, 'average:', tot / len(result))      # use // to truncate
total: 150000 average: 50000.0

Or we can use more advanced tools such as comprehensions and generator expressions to calculate sums, averages, maximums, and the like:

>>> print(sum(rec[1] for rec in result))                     # generator expr
150000
>>> print(sum(rec[1] for rec in result) / len(result))
50000.0
>>> print(max(rec[1] for rec in result))
60000

The Python approach is more general, but it doesn't buy us much until things become more complex. For example, here are a few more advanced comprehensions that collect the names of people whose pay is above and below the average in the query result set:

>>> avg = sum(rec[1] for rec in result) / len(result)
>>> print([rec[0] for rec in result if rec[1] > avg])
['kim']
>>> print([rec[0] for rec in result if rec[1] < avg])
['ann']

We may be able to do some of these kinds of tasks with more advanced SQL techniques such as nested queries, but we eventually reach a complexity threshold where Python's general-purpose nature makes it attractive and potentially more portable. For comparison, here is the equivalent SQL:

>>> query = ("select name from people where job = 'devel' and "
...          "pay > (select avg(pay) from people where job = 'devel')")
>>> curs.execute(query)
>>> curs.fetchall()
[('kim',)]

>>> query = ("select name from people where job = 'devel' and "
...          "pay < (select avg(pay) from people where job = 'devel')")
>>> curs.execute(query)
>>> curs.fetchall()
[('ann',)]

This isn't the most complex SQL you're likely to meet, but beyond this point, SQL can become more involved. Moreover, unlike Python, SQL is limited to database-specific tasks by design. Imagine a query that compares a column's values to data fetched off the Web or from a user in a GUI--simple with Python's Internet and GUI support, but well beyond the scope of a special-purpose language such as SQL. By combining Python and SQL, you get the best of both and can choose which is best suited to your goals. Moreover, Python's tool
set is arbitrarily extensible with functions, modules, and classes. To illustrate, here are some of the same operations coded in a more mnemonic fashion with the dictionary-record module we wrote earlier:

>>> from makedicts import makedicts
>>> recs = makedicts(curs, "select * from people where job = 'devel'")
>>> print(len(recs), recs[0])
3 {'pay': 50000, 'job': 'devel', 'name': 'bob'}

>>> print([rec['name'] for rec in recs])
['bob', 'ann', 'kim']
>>> print(sum(rec['pay'] for rec in recs))
150000

>>> avg = sum(rec['pay'] for rec in recs) / len(recs)
>>> print([rec['name'] for rec in recs if rec['pay'] > avg])
['kim']
>>> print([rec['name'] for rec in recs if rec['pay'] >= avg])
['bob', 'kim']

Similarly, Python's set object type provides operations such as intersection, union, and difference, which can serve as alternatives to other SQL database operations (in the interest of space, we'll leave their exploration as a suggested side trip). For more advanced database extensions, see the SQL-related tools available for Python in the third-party domain. For example, a variety of packages add an OOP flavor to the DB API--including the ORMs we'll explore near the end of this chapter.

SQL Utility Scripts

At this point in our SQL DB API tour, we've started to stretch the interactive prompt to its breaking point--we wind up retyping the same boilerplate code again every time we start a session and every time we run a test. Moreover, the code we're writing is substantial enough to be reused in other programs. Let's wrap up by transforming our code into reusable scripts that automate tasks and support reuse.

To illustrate more of the power of the Python/SQL mix, this section presents a handful of utility scripts that perform common tasks--the sorts of things you'd otherwise have to recode often during development. As an added bonus, most of these files are both command-line utilities and modules of functions that can be imported and called from other programs. Most of the scripts in this section also allow a database file name to be passed in on the command line; this allows us to use different databases for different purposes during development--changes
in one won't impact others.

Table load scripts

Let's take a quick look at the code first, before seeing it in action; feel free to skip ahead to correlate the code here with its behavior. As a first (and less than ideal) step, the following example shows a simple way to script-ify the table-loading logic of the prior section.
Example: PP4E\Dbase\Sql\loaddb1.py

"""
load table from comma-delimited text file; equivalent to this
nonportable SQL:
load data local infile 'data.txt' into table people fields terminated by ','
"""

import sqlite3
conn = sqlite3.connect('dbase1')
curs = conn.cursor()

file = open('data.txt')
rows = [line.rstrip().split(',') for line in file]
for rec in rows:
    curs.execute('insert into people values (?, ?, ?)', rec)

conn.commit()      # commit changes now, if db supports transactions
conn.close()       # close, __del__ call rollback if changes not committed yet

As is, this is a top-level script geared toward one particular case. It's hardly any extra work to generalize it into a function that can be imported and used in a variety of scenarios, as in the next example--a much more widely useful module and command-line script.

Example: PP4E\Dbase\Sql\loaddb.py

"""
load a table from a comma-delimited text file: reusable/generalized version;
importable functions; command-line usage: loaddb.py dbfile? datafile? table?
"""

def login(dbfile):
    import sqlite3
    conn = sqlite3.connect(dbfile)                       # create or open db file
    curs = conn.cursor()
    return conn, curs

def loaddb(curs, table, datafile, conn=None, verbose=True):
    file = open(datafile)                                # x,x,x\nx,x,x\n
    rows = [line.rstrip().split(',') for line in file]   # [[x,x,x], [x,x,x]]
    rows = [str(tuple(rec)) for rec in rows]             # ["(x,x,x)", "(x,x,x)"]
    for recstr in rows:
        curs.execute('insert into ' + table + ' values ' + recstr)
    if conn: conn.commit()
    if verbose: print(len(rows), 'rows loaded')

if __name__ == '__main__':
    import sys
    dbfile, datafile, table = 'dbase1', 'data.txt', 'people'
    if len(sys.argv) > 1: dbfile   = sys.argv[1]
    if len(sys.argv) > 2: datafile = sys.argv[2]
    if len(sys.argv) > 3: table    = sys.argv[3]
    conn, curs = login(dbfile)
    loaddb(curs, table, datafile, conn)
For the insert statement, see the script's comments for the transforms applied to each record before it is embedded in the statement string. We could also use an executemany call as we did earlier, but we want to be general and avoid hardcoding the fields insertion template--this function might be used for tables with any number of columns.

This file also defines a login function to automate the initial connection calls--after retyping this four-command sequence enough times, it seemed a prime candidate for a function. In addition, this reduces code redundancy; in the future, such logic need only be changed in a single location if we change database systems, as long as the login function is used everywhere.

Table display script

Once we load data, we probably will want to display it. The next example allows us to display results as we go--it prints an entire table with either a simple display (which could be parsed by other tools) or a formatted display (generated with the dictionary-record utility we wrote earlier). Notice how it computes the maximum field-name size for alignment with a generator expression; the size is passed in to a string formatting expression by specifying an asterisk (*) for the field size in the format string.

Example: PP4E\Dbase\Sql\dumpdb.py

"""
display table contents as raw tuples, or formatted with field names;
command-line usage: dumpdb.py dbname? table? [-] (dash=formatted display)
"""

def showformat(recs, sept=('-' * 40)):
    print(len(recs), 'records')
    print(sept)
    for rec in recs:
        maxkey = max(len(key) for key in rec)               # max key len
        for key in rec:                                     # or: \t align
            print('%-*s => %s' % (maxkey, key, rec[key]))   # -ljust, *len
        print(sept)

def dumpdb(cursor, table, format=True):
    if not format:
        cursor.execute('select * from ' + table)
        while True:
            rec = cursor.fetchone()
            if not rec: break
            print(rec)
    else:
        from makedicts import makedicts
        recs = makedicts(cursor, 'select * from ' + table)
        showformat(recs)

if __name__ == '__main__':
    import sys
    dbname, format, table = 'dbase1', False, 'people'
    cmdargs = sys.argv[1:]
    if '-' in cmdargs:                    # format if '-' in cmdline args
        format = True                     # dbname if other cmdline arg
        cmdargs.remove('-')
    if cmdargs: dbname = cmdargs.pop(0)
    if cmdargs: table = cmdargs[0]
    from loaddb import login
    conn, curs = login(dbname)
    dumpdb(curs, table, format)

While we're at it, let's code some utility scripts to initialize and erase the database, so we do not have to type these by hand at the interactive prompt again every time we want to start from scratch. The next example completely deletes and re-creates the database, to reset it to an initial state (we did this manually at the start of the tutorial).

Example: PP4E\Dbase\Sql\makedb.py

"""
physically delete and re-create database files;
usage: makedb.py dbname? tablename?
"""

import sys
if input('Are you sure?').lower() not in ('y', 'yes'):
    sys.exit()

dbname = (len(sys.argv) > 1 and sys.argv[1]) or 'dbase1'
table  = (len(sys.argv) > 2 and sys.argv[2]) or 'people'

from loaddb import login
conn, curs = login(dbname)
try:
    curs.execute('drop table ' + table)
except:
    print('database table did not exist')

command = 'create table %s (name char(30), job char(10), pay int(4))' % table
curs.execute(command)
conn.commit()                      # commit may be optional here
print('made', dbname, table)

Next, the clear script in the following example deletes all rows in the table, instead of dropping and re-creating them entirely. For testing purposes, either approach is sufficient. Minor caveat: the rowcount attribute doesn't always reflect the number of rows deleted in SQLite; see its library manual entry for details.

Example: PP4E\Dbase\Sql\cleardb.py

"""
delete all rows in table, but don't drop the table or the database it is in;
usage: cleardb.py dbname? tablename?
"""
import sys
if input('Are you sure?').lower() not in ('y', 'yes'):
    sys.exit()

dbname = sys.argv[1] if len(sys.argv) > 1 else 'dbase1'
table  = sys.argv[2] if len(sys.argv) > 2 else 'people'

from loaddb import login
conn, curs = login(dbname)
curs.execute('delete from ' + table)
#print(curs.rowcount, 'records deleted')    # conn closed by its __del__
conn.commit()                               # else rows not really deleted

Finally, the next example provides a command-line tool that runs a query and prints its result table in formatted style. There's not much to this script, because we've automated most of its tasks already; this is largely just a combination of existing tools. Such is the power of code reuse in Python.

Example: PP4E\Dbase\Sql\querydb.py

"""
run a query string, display formatted result table;
example: querydb.py dbase1 "select name, job from people where pay > 50000"
"""

import sys
database, querystr = 'dbase1', 'select * from people'
if len(sys.argv) > 1: database = sys.argv[1]
if len(sys.argv) > 2: querystr = sys.argv[2]

from makedicts import makedicts
from dumpdb import showformat
from loaddb import login

conn, curs = login(database)
rows = makedicts(curs, querystr)
showformat(rows)

Using the scripts

Last but not least, here is a log of a session that makes use of these scripts in command-line mode, to illustrate their operation. Most of the files also have functions that can be imported and called from a different program; the scripts simply map command-line arguments to the functions' arguments when run standalone. The first thing we do is initialize a testing database and load its table from a text file:

C:\...\PP4E\Dbase\Sql> makedb.py testdb
Are you sure?y
database table did not exist
made testdb people

C:\...\PP4E\Dbase\Sql> loaddb.py testdb data2.txt
3 rows loaded
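These utilities can also be driven from Python code rather than the shell. The following is a condensed, self-contained stand-in for the login/loaddb pair, so the session above can be mimicked inside one program; unlike the real loaddb.py, this simplified version builds '?' placeholders for the insert instead of interpolating a record string, and it works on a temporary data file rather than the book's files:

```python
import os, sqlite3, tempfile

# Condensed stand-ins for loaddb.py's login/loaddb pair. This loaddb
# builds a '?' placeholder list per record, rather than the real
# script's str(tuple(rec)) string-interpolation scheme.
def login(dbfile):
    conn = sqlite3.connect(dbfile)        # create or open db file
    return conn, conn.cursor()

def loaddb(curs, table, datafile, conn=None):
    with open(datafile) as f:
        rows = [line.rstrip().split(',') for line in f]
    for rec in rows:
        qmarks = ', '.join('?' * len(rec))
        curs.execute('insert into %s values (%s)' % (table, qmarks), rec)
    if conn:
        conn.commit()
    return len(rows)

# exercise the stand-ins on a temporary data file and in-memory database
datafile = os.path.join(tempfile.mkdtemp(), 'data.txt')
with open(datafile, 'w') as f:
    f.write('bob,devel,50000\nsue,music,60000\n')

conn, curs = login(':memory:')
curs.execute('create table people (name char(30), job char(10), pay int(4))')
n = loaddb(curs, 'people', datafile, conn)
print(n, 'rows loaded')
```

The real scripts layer verbosity flags and command-line argument handling on top of this same core logic.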
Next, run the dump utility to view the table's contents (twice here: once as raw tuples, and once with the '-' argument for a formatted display):

C:\...\PP4E\Dbase\Sql> dumpdb.py testdb
('bob', 'developer', 80000)
('sue', 'music', 90000)
('ann', 'manager', 80000)

C:\...\PP4E\Dbase\Sql> dumpdb.py testdb -
3 records
----------------------------------------
pay  => 80000
job  => developer
name => bob
----------------------------------------
pay  => 90000
job  => music
name => sue
----------------------------------------
pay  => 80000
job  => manager
name => ann
----------------------------------------

The dump script is an exhaustive display; to be more specific about which records to view, use the query script and pass in a query string on the command line (the command lines are split here to fit in this book):

C:\...\PP4E\Dbase\Sql> querydb.py testdb
                       "select name, job from people where pay = 80000"
2 records
----------------------------------------
job  => developer
name => bob
----------------------------------------
job  => manager
name => ann
----------------------------------------

C:\...\PP4E\Dbase\Sql> querydb.py testdb "select * from people where name = 'sue'"
1 records
----------------------------------------
pay  => 90000
job  => music
name => sue
----------------------------------------

Now, let's erase and start again with a new data set file. The clear script erases all records but doesn't reinitialize the database completely:

C:\...\PP4E\Dbase\Sql> cleardb.py testdb
Are you sure?y

C:\...\PP4E\Dbase\Sql> dumpdb.py testdb -
0 records
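Incidentally, because sqlite3's rowcount can't always be trusted for deletes (the caveat noted with cleardb.py above), a count query issued before the delete is a portable way to report how many rows were removed. A minimal sketch with illustrative in-memory data:

```python
import sqlite3

# Report deletions by counting rows first, rather than trusting
# cursor.rowcount, which may not reflect deletes under sqlite3.
conn = sqlite3.connect(':memory:')
curs = conn.cursor()
curs.execute('create table people (name char(30), job char(10))')
curs.executemany('insert into people values (?, ?)',
                 [('bob', 'dev'), ('sue', 'mus')])

curs.execute('select count(*) from people')
before = curs.fetchone()[0]
curs.execute('delete from people')
conn.commit()
print(before, 'records deleted')
```

This is why the rowcount print in cleardb.py is commented out; a count-first variant could restore that feedback portably.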
C:\...\PP4E\Dbase\Sql> loaddb.py testdb data.txt
5 rows loaded

C:\...\PP4E\Dbase\Sql> dumpdb.py testdb
('bob', 'devel', 50000)
('sue', 'music', 60000)
('ann', 'devel', 40000)
('tim', 'admin', 30000)
('kim', 'devel', 60000)

In closing, here are three queries in action on this new data set: they fetch names of developers, jobs that pay above an amount, and records with a given pay level sorted by job. We could run these at the Python interactive prompt, of course, but we're getting a lot of setup and boilerplate code for free here:

C:\...\PP4E\Dbase\Sql> querydb.py testdb "select name from people where job = 'devel'"
3 records
----------------------------------------
name => bob
----------------------------------------
name => ann
----------------------------------------
name => kim
----------------------------------------

C:\...\PP4E\Dbase\Sql> querydb.py testdb "select job from people where pay >= 60000"
2 records
----------------------------------------
job => music
----------------------------------------
job => devel
----------------------------------------

C:\...\PP4E\Dbase\Sql> querydb.py testdb
                       "select * from people where pay >= 60000 order by job"
2 records
----------------------------------------
pay  => 60000
job  => devel
name => kim
----------------------------------------
pay  => 60000
job  => music
name => sue
----------------------------------------

Before we move on, some context: the scripts in this section illustrate the benefits of code reuse, accomplish their purpose (which was partly to demonstrate the SQL API), and serve as a model for canned database utilities. But they are still not as general as they could be; support for sorting options in the dump script, for example, may be
a useful extension. At some point, though, we may need to revert to typing SQL commands in a client--part of the reason SQL is a language is because it must support so much generality. Further extensions to these scripts are left as exercises. Change this code as you like; it's Python, after all.

SQL Resources

Although the examples we've seen in this section are simple, their techniques scale up to much more realistic databases and contexts. The websites we studied in the prior part of the book, for instance, can make use of SQL-based systems such as MySQL to store page state information as well as long-lived client information. Because MySQL (among others) supports both large databases and concurrent updates, it's a natural for website implementation.

There is more to database interfaces than we've seen, but additional API documentation is readily available on the Web. To find the full database API specification, search the Web for "Python Database API." You'll find the formal API definition--really just a text file describing the PEP (the Python Enhancement Proposal) under which the API was hashed out.

Perhaps the best resource for additional information about database extensions today is the home page of the Python database SIG. Go to the Community and SIGs links at www.python.org, and navigate to the database group's page, or run a search there; you'll find API documentation (this is where it is officially maintained), links to database vendor-specific extension modules, and more. And as always, see the PyPI website and search the Web at large for related third-party tools and extensions.

ORMs: Object Relational Mappers

In this chapter, we've seen OODBs that store native Python objects persistently, as well as SQL databases that store information in tables. It turns out that there is another class of system that attempts to bridge the object and table worlds, which I've hinted at earlier in this chapter: ORMs graft the Python class model onto the tables of relational databases. They combine the power of relational database systems with the simplicity of Python class-based syntax--you don't
need to forgo SQL-based databases, but you can still store data that seems like Python objects to your scripts.

Today, there are two leading open source third-party systems that implement this mapping: SQLObject and SQLAlchemy. Both are fairly complex systems that we cannot do full justice to in this text, and you're best off researching their documentation on the Web for the full story (there are also dedicated books covering SQLAlchemy today). Moreover, neither is completely Python 3 ready as I write these words, so we can't run live examples with them in this text.
here, though, is a quick look at how you might use the sqlobject system to create and process database records. in brief, sqlobject maps:

python classes to database tables
python class instances to rows in the table
python instance attributes to row columns

for example, to create a table, we define it with a class whose class attributes define columns, and call its creation method (this code is derived from a more complete example at sqlobject's website):

from sqlobject import *

sqlhub.processConnection = connectionForURI('sqlite:/:memory:')

class Person(SQLObject):                       # class: describes table
    first = StringCol()                        # class attributes: row columns
    mid   = StringCol(length=1, default=None)
    last  = StringCol()

Person.createTable()                           # create a database table

once created, making an instance automatically inserts a row into the database, and attribute fetches and assignments are automatically mapped to fetches and updates of the corresponding table row's columns:

p = Person(first='Bob', last='Smith')          # new instance: makes and inserts row
p                                              # prints all attributes by name
p.first                                        # attribute: fetches row column
p.mid = 'M'                                    # attribute: updates record

existing rows/instances may be fetched by method calls, and we can assign multiple columns/attributes with a single update operation:

p = Person.get(1)                              # fetch existing record/instance
p.set(first='Tom', last='Jones')               # update two attributes/fields at once

in addition, we can select by column values by creating a query object and executing it:

ts = Person.select(Person.q.first == 'Tom')    # query: select by column value
list(ts)                                       # run the query: list of instances
tjs = Person.selectBy(first='Tom', last='Jones')   # alternative query form

and naturally, this barely scratches the surface of the available functionality. even at this level of complexity, though, this is quite a trick--sqlobject automatically issues all the sql required to fetch, store, and query the table and rows implied by the python class syntax here. again, the net effect allows systems to leverage the power of enterprise-level relational databases, but still use familiar python class syntax to process stored data in python
scripts. sqlalchemy, the second of these systems,
5,920 | is functionally similar for more details on orms for pythonconsult your friendly neighborhood web search engine you can also learn more about such systems by their roles in some larger web development frameworksdjangofor instancehas an orm which is another variation on this theme pyforma persistent object viewer (externalinstead of going into additional database interface details that are freely available on the webi' going to close out this by directing you to supplemental example that shows one way to combine the gui technology we met earlier in the text with the persistence techniques introduced in this this example is named pyform-- python/tkinter gui designed to let you browse and edit tables of recordstables browsed may be shelvesdbm filesin-memory dictionariesor any other object that looks and feels like dictionary records within tables browsed can be class instancessimple dictionariesstringsor any other object that can be translated to and from dictionary although this example is about guis and persistenceit also illustrates python design techniques to keep its implementation both simple and type-independentthe pyform gui is coded to expect tables to look like dictionaries of dictionaries to support variety of table and record typespyform relies on separate wrapper classes to translate tables and records to the expected protocolat the top table levelthe translation is easy--shelvesdbm filesand in-memory dictionaries all have the same key-based interface at the nested record levelthe gui is coded to assume that stored items have dictionary-like interfacetoobut classes intercept dictionary operations to make records compatible with the pyform protocol records stored as strings are converted to and from the dictionary objects on fetches and storesrecords stored as class instances are translated to and from attribute dictionaries more specialized translations can be added in new table wrapper classes the net effect is that pyform can be used to browse 
and edit wide variety of table typesdespite its dictionary interface expectations when pyform browses shelves and dbm filestable changes made within the gui are persistent--they are saved in the underlying files when used to browse shelve of class instancespyform essentially becomes gui frontend to simple object database that is built using standard python persistence tools to view and update shelve of objects with pyformfor examplecode like the following will sufficeimport shelve from formgui import formgui db shelve open(/data/castfile'formgui(dbmainloop( databases and persistence after initcast reopen shelve file browse existing shelve-of-dicts |
5,921 | like thisfrom pp dbase testdata inport actor from formgui import formgui from formtable import shelveofinstance testfile /data/shelvetable shelveofinstance(testfileactorformgui(tablemainloop(table close(run in tablebrowser dir external filename wrap shelf in table object close needed for some dbm figure - captures the scene under python and windows when viewing shelve of persistent class instance objects this pyform session was kicked off by commandline described in its form table module' self-test codeformtable py shelve and omit the (or pass it as to avoid reinitializing the shelve at the start of each session so changes are retained pyform' gui can also be started from the pydemos launcher we met in though it does not save changes persistently in this mode run the example on your own computer to get better sample of its operation though not fully general python persistent object table viewerpyform serves as simple object database front end figure - pyform displaying shelve of actor objects because we are short on time and space in this editioni' going to omit both the source code for this example and its description here to study pyformsee the following directory in the book' examples package distribution described in the prefacec:\pp \dbase\tablebrowser pyforma persistent object viewer (external |
pyform overview material from the third edition in a pdf file. pyform's source code files are ported to python 3.x form, though code in the overview document still shows its third-edition roots. for the purposes of the published portions of this book, let's move on to the next chapter and our next tools topic: data structure implementations.
5,923 | data structures "roses are redviolets are bluelists are mutableand so is set foodata structures are central theme in most programseven if python programmers often don' need to care their apathy is warranted--python comes "out of the boxwith rich set of built-in and already optimized types that make it easy to deal with structured datalistsstringstuplesdictionariessetsand the like for simple systemsthese types are usually enough dictionariesfor examplesubsume many of the classical searching algorithmsand lists replace much of the work you' do to support collections in lower-level languages even soboth are so easy to use that you generally never give them second thought but for advanced applicationswe may need to add more sophisticated types of our own to handle extra requirements or special cases in this we'll explore handful of advanced data structure implementations in pythonsetsstacksgraphsand so on as we'll seedata structures can take the form of new object types in pythonintegrated into the language' type model that isobjects we code in python become full-fledged datatypes--to the scripts that use themthey can look and feel just like built-in listsnumbersand dictionaries although the examples in this illustrate advanced programming and computer science techniquesthey also underscore python' support for writing reusable software as object implementations are coded with classes and modulesthey naturally become generally useful components that can be used in any program that imports them in effectwe will be building libraries of data structure toolswhether we plan for it or not moreoverthough most examples in this are pure python code (and at least to linear readerssome may seem relatively simple compared to those of earlier )they also provide use case for discussing python performance issuesand hint at what' possible with the subject of --from the most general perspectivenew |
5,924 | types coded in use patterns similar to those here in the endthoughwe'll also see that python' built-in support can often take the place of homegrown solutions in this domain although custom data structure implementations are sometimes necessary and still have much to offer in terms of code maintenance and evolutionthey are not always as paramount in python as they are in less programmer-friendly languages implementing stacks stacks are common and straightforward data structureused in variety of applicationslanguage processinggraph searchesand so on for instanceexpression evaluation in the next calculator gui is largely an exercise in juggling stacksand programming languages in general typically implement function calls as stack operations in order to remember what to resume as calls return stacks can also help in xml parsingthey are natural for tracking progress any time constructs might be arbitrarily nested in shortstacks are last-in-first-out collection of objects--the last item added to the collection is always the next one to be removed unlike the queues we used for thread communicationwhich add and delete at opposite endsall the activity in stacks happens at the top clients use stacks bypushing items onto the top popping items off the top depending on client requirementsthere may also be tools for such tasks as testing whether the stack is emptyfetching the top item without popping ititerating over stack' itemstesting for item membershipand so on built-in options in pythona simple list is often adequate for implementing stackbecause we can change lists in place arbitrarilywe can add and delete items from either the beginning (leftor the end (righttable - summarizes various built-in operations available for implementing stack-like behavior with python lists and in-place changes they vary depending on whether the stack "topis the first or the last node in the listin this tablethe string 'bis the initial top item on the stack table - stacks as lists 
operation    top is end-of-list         top is front-of-list       top is front-of-list
new          stack = ['a', 'b']         stack = ['b', 'a']         stack = ['b', 'a']
push         stack.append('c')          stack.insert(0, 'c')       stack[0:0] = ['c']
pop          top = stack[-1]            top = stack[0]             top = stack[0]
             del stack[-1]              del stack[0]               stack[:1] = []

even more conveniently, python grew a list pop method later in its life, designed to be used in conjunction with append to implement stacks and other common structures such as queues, yielding the even simpler coding options listed in the next table:

stacks as lists, coding alternatives

operation    top is end-of-list         top is front-of-list
new          stack = ['a', 'b']         stack = ['b', 'a']
push         stack.append('c')          stack.insert(0, 'c')
pop          top = stack.pop()          top = stack.pop(0)

by default, pop is equivalent to fetching, and then deleting, the last item at offset -1. with an argument, pop deletes and returns the item at that offset--list.pop(-1) is the same as list.pop(). for in-place change operations like append, insert, del, and pop, no new list is created in memory, so execution is quick (performance may further depend upon which end is the "top," but this in turn depends on python's current list implementation, as well as measurement concepts we'll explore later). queues can be coded similarly, but they pop at the other end of the list.

other built-in coding schemes are possible as well. for instance, del stack[:1] is yet another way to delete the first item in a list-based stack in-place. depending on which end is interpreted as the top of the stack, the following sequence assignment statement forms can be used to fetch and remove the top item as well, albeit at the cost of making a new list object each time:

# top is front of list
top, stack = stack[0], stack[1:]          # python 2.x + 3.x
top, *stack = stack                       # python 3.x only

# top is end of list
stack, top = stack[:-1], stack[-1]        # python 2.x + 3.x
*stack, top = stack                       # python 3.x only

with so many built-in stack options, why bother implementing others? for one thing, they serve as a simple and familiar context for exploring data structure concepts in this book. more importantly, though, there is a practical programming motive here. list representations work and will be relatively fast, but they also bind stack-based programs to the stack representation chosen: all
stack operations will be hardcoded throughout program if we later want to change how stack is represented or extend its basic operation setwe're stuck--every stack-based program will have to be updatedand in every place where it accesses the stack implementing stacks |
performs, we'd have to add code around each hardcoded stack operation; in a large system, this update may be nontrivial work. as we'll discuss in a later chapter, we may also decide to move stacks to a c-based implementation, if they prove to be a performance bottleneck. as a general rule, hardcoded operations on built-in data structures don't support future migrations as well as we'd sometimes like.

as we'll see later, built-in types such as lists are actually class-like objects in python that we can subclass to customize, too. this might be only a partial fix, though; unless we anticipate future changes and make instances of a subclass, we may still have a maintenance issue if we use built-in list operations directly and ever want to extend what they do in the future.

a stack module

of course, the real solution to such code maintenance dilemmas is to encapsulate--that is, wrap up--stack implementations behind interfaces, using python's code reuse tools. as long as clients stick to using the interfaces, we're free to change the interfaces' implementations arbitrarily without having to change every place they are called. let's begin by implementing a stack as a module containing a python list, plus functions to operate on it; the following example (PP4E\Dstruct\Basic\stack1.py) shows one way to code this:

"a shared stack module"

stack = []                                    # on first import
class error(Exception): pass                  # local excs, stack1.error

def push(obj):
    global stack                              # use 'global' to change
    stack = [obj] + stack                     # add item to the front

def pop():
    global stack
    if not stack:
        raise error('stack underflow')        # raise local error
    top, *stack = stack                       # remove item at front
    return top

def top():
    if not stack:                             # raise local error
        raise error('stack underflow')        # or let IndexError occur
    return stack[0]

def empty():      return not stack            # is the stack []?
def member(obj):  return obj in stack         # item in stack?
def item(offset): return stack[offset]        # index the stack
def length():     return len(stack)           # number entries
def dump():       print('<Stack:%s>' % stack)
the stack is declared global in functions that change it, but not in those that just reference it. the module also defines an error object (error) that can be used to catch exceptions raised locally in this module; some stack errors are built-in exceptions (the method item triggers IndexError for out-of-bounds indexes). most of the stack's functions just delegate the operation to the embedded list used to represent the stack. in fact, the module is really just a simple wrapper around a python list. because this extra layer of interface logic makes clients independent of the actual implementation of the stack, though, we're able to change the stack later without impacting its clients.

as usual, one of the best ways to understand such code is to see it in action. here's an interactive session that illustrates the module's interfaces--it implements a stack of arbitrary python objects:

C:\PP4E\Dstruct\Basic> python
>>> import stack1
>>> stack1.push('spam')
>>> stack1.push(123)
>>> stack1.top()
123
>>> stack1.stack
[123, 'spam']
>>> stack1.pop()
123
>>> stack1.dump()
<Stack:['spam']>
>>> stack1.pop()
'spam'
>>> stack1.empty()
True

>>> for c in 'spam': stack1.push(c)
...
>>> while not stack1.empty():
...     print(stack1.pop(), end=' ')
...
m a p s
>>> stack1.pop()
stack1.error: stack underflow

other operations are analogous, but the main thing to notice here is that all stack operations are module functions. for instance, it's possible to iterate over the stack, but we need to use counter-loops and indexing function calls (item). nothing is preventing clients from accessing (and even changing) stack1.stack directly, but doing so defeats the purpose of interfaces like this one:

>>> for c in 'spam': stack1.push(c)
...
>>> stack1.dump()
<Stack:['m', 'a', 'p', 's']>
>>> for i in range(stack1.length()):
...     print(stack1.item(i), end=' ')
...
m a p s

a stack class

perhaps the biggest drawback of the module-based stack is that it supports only a single stack object; all clients of the stack module effectively share the same stack. sometimes we want this feature: a stack can serve as a shared-memory object for multiple modules. but to implement a true stack datatype that can generate independent objects, we need to use classes.

to illustrate, let's define a full-featured stack class. the Stack class shown in the next example (PP4E\Dstruct\Basic\stack2.py) defines a new datatype with a variety of behaviors. like the module, the class uses a python list to hold stacked objects, but this time, each instance gets its own list. the class defines both "real" methods and specially named methods that implement common type operations; comments in the code describe the special methods:

"a multi-instance stack class"

class error(Exception): pass                     # when imported: local exception

class Stack:
    def __init__(self, start=[]):                # self is the instance object
        self.stack = []                          # start is any sequence: stack...
        for x in start: self.push(x)
        self.reverse()                           # undo push's order reversal

    def push(self, obj):                         # methods: like module + self
        self.stack = [obj] + self.stack          # top is front of list

    def pop(self):
        if not self.stack: raise error('underflow')
        top, *self.stack = self.stack
        return top

    def top(self):
        if not self.stack: raise error('underflow')
        return self.stack[0]

    def empty(self):
        return not self.stack                    # instance.empty()

    # overloads
    def __repr__(self):
        return '[Stack:%s]' % self.stack         # print, repr(), ...
    def __eq__(self, other):
        return self.stack == other.stack         # '==', '!='
    def __len__(self):
        return len(self.stack)                   # len(instance), not instance
    def __add__(self, other):
        return Stack(self.stack + other.stack)   # instance1 + instance2
    def __mul__(self, reps):
        return Stack(self.stack * reps)          # instance * reps
    def __getitem__(self, offset):               # see also __iter__
        return self.stack[offset]                # instance[i], [i:j], in, for
    def __getattr__(self, name):
        return getattr(self.stack, name)         # instance.sort()/reverse()/...

now distinct instances are created by calling the Stack class like a function. in most respects, the Stack class implements operations exactly like the stack module, but here, access to the stack is qualified by self, the subject instance object. each instance has its own stack attribute, which refers to the instance's own list. furthermore, instance stacks are created and initialized in the __init__ constructor method, not when the module is first imported. let's make a couple of stacks to see how all this works in practice:

>>> from stack2 import Stack
>>> x = Stack()                            # make a Stack object, push items
>>> x.push('spam')
>>> x.push(123)
>>> x                                      # __repr__ prints a stack
[Stack:[123, 'spam']]

>>> y = Stack()                            # two distinct stack objects
>>> y.push(3.1415)                         # they do not share content
>>> y.push(x.pop())
>>> x, y
([Stack:['spam']], [Stack:[123, 3.1415]])

>>> z = Stack()                            # third distinct stack object
>>> for c in 'spam': z.push(c)
...
>>> while z: print(z.pop(), end=' ')       # __len__ tests stack truth
...
m a p s

>>> z = x + y                              # __add__ handles stack +
>>> z                                      # holds three different types
[Stack:['spam', 123, 3.1415]]
>>> for item in z: print(item, end=' ')    # __getitem__ does for
...
spam 123 3.1415

>>> z.reverse()                            # __getattr__ delegates to list
>>> z
[Stack:[3.1415, 123, 'spam']]

besides such operator overloading, the class also lets clients process its
5,930 | instances by attribute references and expressions additionallyit defines the __get attr__ special method to intercept references to attributes not defined in the class and to route them to the wrapped list object (to support list methodssortappendreverseand so onmany of the module' operations become operators in the class table - shows the equivalence of module and class operations (columns and and gives the class method that comes into play for each (column table - module/class operation comparison module operations class operations class method module empty(not instance __len__ module member(xx in instance __getitem__ module item(iinstance[i__getitem__ module length(len(instance__len__ module dump(print(instance__repr__ range(counter loops for in instance __getitem__ manual loop logic instance instance __add__ module stack reverse(instance reverse(__getattr__ module push/pop/top instance push/pop/top push/pop/top in effectclasses let us extend python' set of built-in types with reusable types implemented in python modules class-based types may be used just like built-in typesdepending on which operation methods they defineclasses can implement numbersmappingsand sequencesand may or may not be mutable class-based types may also fall somewhere in between these categories customizationperformance monitors so far we've seen how classes support multiple instances and integrate better with python' object model by defining operator methods one of the other main reasons for using classes is to allow for future extensions and customizations by implementing stacks with classwe can later add subclasses that specialize the implementation for new demands in factthis is often the main reason for using custom class instead of built-in alternative for instancesuppose we've started using the stack class in example - but we start running into performance problems one way to isolate bottlenecks is to instrument data structures with logic that keeps track of usage 
statisticswhich we can analyze after running client applications because stack is classwe can add such logic in new subclass without affecting the original stack module (or its clientsthe subclass in example - extends stack to keep track of overall push/pop usage frequencies and to record the maximum size of each instance data structures |
5,931 | "customize stack for usage datafrom stack import stack extends imported stack class stacklog(stack)pushes pops def __init__(selfstart=[])self maxlen stack __init__(selfstartcount pushes/popsmax-size shared/static class members could also be module vars def push(selfobject)stack push(selfobjectstacklog pushes + self maxlen max(self maxlenlen(self)do real push overall stats per-instance stats def pop(self)stacklog pops + return stack pop(selfoverall counts not 'self pops'instance def stats(self)return self maxlenself pushesself pops get counts from instance this subclass works the same as the original stackit just adds monitoring logic the new stats method is used to get statistics tuple through an instancefrom stacklog import stacklog stacklog( stacklog(make two stack objects for in range( ) push(iand push object on them for in 'spam' push(cxy run inherited __repr__ ([stack:[ ]][stack:[' '' '' '' ']] stats() stats((( )( ) pop() pop((' ' stats() stats(my maxlenall pushesall pops (( )( )notice the use of class attributes to record overall pushes and popsand instance attributes for per-instance maximum length by hanging attributes on different objectswe can expand or narrow their scopes optimizationtuple tree stacks one of the nice things about wrapping objects up in classes is that you are free to change the underlying implementation without breaking the rest of your program optimizations can be added in the futurefor instancewith minimal impactthe interface is implementing stacks |
retained. some data structure representations are more efficient than others, though. so far, our stacks have used slicing and extended sequence assignment to implement pushing and popping. this method is relatively inefficient: both operations make copies of the wrapped list object. for large stacks, this practice can add a significant time penalty. one way to speed up such code is to change the underlying data structure completely. for example, we can store the stacked objects in a binary tree of tuples: each item may be recorded as a pair, (object, tree), where object is the stacked item and tree is either another tuple pair giving the rest of the stack or None to designate an empty stack. a stack of items [1, 2, 3, 4] would be internally stored as a tuple tree (4, (3, (2, (1, None)))).

this tuple-based representation is similar to the notion of "cons-cells" in lisp-family languages: the object on the left is the car, and the rest of the tree on the right is the cdr. because we add or remove only a top tuple to push and pop items, this structure avoids copying the entire stack; for large stacks, the benefit might be significant. the next class (PP4E\Dstruct\Basic\stack3.py) implements these ideas:

"optimize with tuple pair trees"

class Stack:
    def __init__(self, start=[]):                # init from any sequence
        self.stack = None                        # even other (fast)stacks
        for i in range(-len(start), 0):
            self.push(start[-i - 1])             # push in reverse order

    def push(self, node):                        # grow tree 'up/left'
        self.stack = node, self.stack            # new root tuple: (node, tree)

    def pop(self):
        node, self.stack = self.stack            # remove root tuple
        return node                              # TypeError if empty

    def empty(self):
        return not self.stack                    # is it 'None'?

    def __len__(self):                           # on: len, not
        len, tree = 0, self.stack
        while tree:
            len, tree = len + 1, tree[1]         # visit right subtrees
        return len

    def __getitem__(self, index):                # on: x[i], in, for
        len, tree = 0, self.stack
        while len < index and tree:              # visit/count nodes
            len, tree = len + 1, tree[1]
        if tree:
            return tree[0]                       # IndexError if out-of-bounds
        else:
            raise IndexError()                   # so 'in' and 'for' stop

    def __repr__(self):
        return '[FastStack:' + repr(self.stack) + ']'

this class's __getitem__ method handles indexing, in tests, and for loop iteration as before (when no __iter__ is defined), but this version has to traverse a tree to find a node by index. notice that this isn't a subclass of the original Stack class; since nearly every operation is implemented differently here, inheritance won't really help. but clients that restrict themselves to the operations that are common to both classes can still use them interchangeably--they just need to import a stack class from a different module to switch implementations. here's a session with this stack version; as long as we stick to pushing, popping, indexing, and iterating, this version is essentially indistinguishable from the original:

>>> from stack3 import Stack
>>> x = Stack()
>>> y = Stack()
>>> for c in 'spam': x.push(c)
...
>>> for i in range(3): y.push(i)
...
>>> x
[FastStack:('m', ('a', ('p', ('s', None))))]
>>> y
[FastStack:(2, (1, (0, None)))]

>>> len(x), x[2], x[-1]
(4, 'p', 'm')

>>> x.pop()
'm'
>>> x
[FastStack:('a', ('p', ('s', None)))]

>>> while y: print(y.pop(), end=' ')
...
2 1 0

optimization: in-place list modifications

the last section tried to speed up pushes and pops with a different data structure, but we might also be able to speed up our stack object by falling back on the mutability of python's list object. because lists can be changed in place, they can be modified more quickly than any of the prior examples. in-place change operations such as append are prone to complications when a list is referenced from more than one place, but because the list inside the stack object isn't meant to be used directly, we're probably safe here. the module in the next example shows one way to implement a stack with in-place changes; some operator overloading methods have been dropped to keep this simple. the python list pop method it uses is equivalent to indexing and deleting the item at offset -1. compared to the original, this class
5,934 | incurs some performance degradation for the extra method callsbut it supports future changes better by encapsulating stack operations example - pp \dstruct\basic\stack py "optimize with in-place list operationsclass error(exception)pass when importedlocal exception class stackdef __init__(selfstart=[])self stack [for in startself push(xself is the instance object start is any sequencestack def push(selfobj)self stack append(objmethodslike module self top is end of list def pop(self)if not self stackraise error('underflow'return self stack pop(like fetch and delete stack[- def top(self)if not self stackraise error('underflow'return self stack[- def empty(self)return not self stack instance empty(def __len__(self)return len(self stacklen(instance)not instance def __getitem__(selfoffset)return self stack[offsetinstance[offset]infor def __repr__(self)return '[stack:% ]self stack this version works like the original in module stack toojust replace stack with stack in the previous interaction to get feel for its operation the only obvious difference is that stack items are in reverse when printed ( the top is the end)from stack import stack stack( push('spam' push( [stack:['spam' ] stack( push( push( pop()xy data structures |
5,935 | top( timing the improvements the prior section' in-place changes stack object probably runs faster than both the original and the tuple-tree versionsbut the only way to really be sure is to time the alternative implementations since this could be something we'll want to do more than oncelet' first define general module for timing functions in python in example - the built-in time module provides clock function that we can use to get the current cpu time in floating-point secondsand the function timer test simply calls function reps times and returns the number of elapsed seconds by subtracting stop from start times example - pp \dstruct\basic\timer py "generic code timer tooldef test(repsfunc*args)import time start time clock(for in range(reps)func(*argsreturn time clock(start or best of nsee learning python current cpu time in float seconds call function reps times discard any return value stop time start time there are other ways to time codeincluding best-of- approach and python' own timeit modulebut this module suffices for our purpose here if you're interested in doing bettersee learning pythonfourth editionfor larger case study on this topicor experiment on your own nextwe define test driver script which deploys the timerin example - it expects three command-line argumentsthe number of pushespopsand indexing operations to perform (we'll vary these arguments to test different scenarioswhen run at the top levelthe script creates instances of the original and optimized stack classes and performs the specified number of operations on each stack pushes and pops change the stackindexing just accesses it example - pp \dstruct\basic\stacktime py "compare performance of stack alternativesimport stack import stack list-based stacks[ ]+ tuple-tree stacks( ,ybecause python is so dynamicguesses about relative performance in python are just as likely to be wrong as right moreovertheir accuracy is prone to change over time trust me on this 've made sweeping 
statements about performance in other booksonly to be made wrong by later python release that optimized some operations more than others performance measurement in python is both nontrivial and an ongoing task in generalcode for readability firstand worry about performance laterbut always gather data to support your optimization efforts implementing stacks |
5,936 | import timer in-place stacksy append(xgeneral function timer utility rept from sys import argv pushespopsitems (int(argfor arg in argv[ :]def stackops(stackclass) stackclass('spam'for in range(pushes) push(ifor in range(items) [ifor in range(pops) pop(make stack object exercise its methods xrange generator or mod __import__(nfor mod in (stack stack stack )rept*(push+pop+ixprint('% :mod __name__end='print(timer test(reptstackopsgetattr(mod'stack'))results under python the following are some of the timings reported by the test driver script the three outputs represent the measured run times in seconds for the originaltupleand inplace stacks for each stack typethe first test creates stack objects and performs roughly , stack operations ( repetitions ( pushes indexes pops)in the test duration times listed these results were obtained on fairly slow windows netbook laptop under python as usual for benchmarksyour mileage will probably vary :\pp \dstruct\basicpython stacktime py stack stack stack :\pp \dstruct\basicpython stacktime py stack stack stack :\pp \dstruct\basicpython stacktime py stack stack stack :\pp \dstruct\basicpython stacktime py stack stack stack if you look closely enoughyou'll notice that the results show that the tuple-based stack (stack performs better when we do more pushing and poppingbut worse if we do much indexing indexing lists is extremely fast for built-in lists (stack and stack )but very slow for tuple trees--the python class must traverse the tree manually data structures |
5,937 | done at all--tuples (stack win by hair in the last test case when there is no indexingas in the last testthe tuple and in-place change stacks are roughly six and five times quicker than the simple list-based stackrespectively since pushes and pops are most of what clients would normally do to stacktuples are contender heredespite their poor indexing performance of coursewe're talking about fractions of second after many tens of thousands of operationsin many applicationsyour users probably won' care either way if you access stack millions of times in your programthoughthis difference may accumulate to significant amount of time more on performance analysis two last notes on performance here although absolute times have changed over the years with new pythons and test machinesthese results have remained relatively the same that istuple-based stacks win when no indexing is performed all performance measurements in dynamic language like python are prone to change over timethoughso run such tests on your own for more accurate results secondthere' often more to performance measurement than timing alternatives this way for more complete pictureread about python' standard library profile module (and its optimized workalikecprofilethe profilers run python codecollect performance data along the wayand provide it in report on code exit it' the most complete way to isolate your code' bottleneckbefore you start working on optimizing with better codingalgorithmsand data structures or moving portions to the language for simple performance analysisthoughour timing module provides the data we need in factwe'll reuse it to measure more dramatic improvement in relative speed for set implementation alternatives--the topic of the next section implementing sets another commonly used data structure is the seta collection of objects that support operations such asintersection make new set with all items in common union make new set with all items in either operand membership test 
whether an item exists in a set. Other operations, such as difference and subset tests, can be useful as well, depending on the intended use. Sets come in handy for dealing with abstract group combinations.
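Looking back at the profiling pointer that closed the performance discussion above, a minimal cProfile session might look like the following sketch. The workload function, its size, and the report handling are my own illustration, not code from this book.

```python
# Hypothetical workload: exercise a list-based stack, then profile it.
import cProfile
import io
import pstats

def pushes_and_pops(n):
    stack = []
    for i in range(n):
        stack.append(i)              # push
    while stack:
        stack.pop()                  # pop

profiler = cProfile.Profile()
profiler.enable()
pushes_and_pops(10000)
profiler.disable()

buf = io.StringIO()
stats = pstats.Stats(profiler, stream=buf)
stats.sort_stats('cumulative').print_stats(5)   # report top 5 entries only
report = buf.getvalue()
print('pushes_and_pops' in report)   # the workload shows up in the report
```

The report isolates where time is actually spent, which is the point made in the text: measure before optimizing.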
5,938 | who do both activities by intersecting the two sets union of such sets would contain either type of individualbut would include any given individual only once this latter property also makes sets ideal for removing duplicates from collections--simply convert to and from set to filter out repeats in factwe relied on such operations in earlier pymailgui in for exampleused intersectionunionand difference to manage the set of active mail downloadsand filtered out duplicate recipients in multiple contexts with set conversion sets are widely relevant tool on practical programs built-in options if you've studied the core python languageyou should already know thatas for stackspython comes with built-in support here as well herethoughthe support is even more direct--python' set datatype provides standard and optimized set operations today as quick reviewbuilt-in set usage is straightforwardset objects are initially created by calling the type name with an iterable or sequence giving the components of the set or by running set comprehension expressionx set('abcde' { for in 'bdxyz' {' '' '' '' '' ' {' '' '' '' '' 'make set from an iterable/sequence same via set comprehension expression once you have setall the usual operations are availablehere are the most common'ein membership true difference {' '' '' ' intersection {' '' ' union {' '' '' '' '' '' '' '' 'interestinglyjust like the dictionariesbuilt-in sets are unorderedand require that all set components be hashable (immutablemaking set with dictionary of items worksbut only because set uses the dictionary iteratorwhich returns the next key on each iteration (it ignores key values) set(['spam''ham''eggs'] {'eggs''ham''spam' {'spam''ham''eggs' {'eggs''ham''spam'sequence of immutables set([['spam''ham']['eggs']]typeerrorunhashable type'listimmutables do not work as items data structures same but set literal if items known |
5,939 | {'eggs''ham''spam'plus there are additional operations we won' illuminate here--see core language text such as learning python for more details for instancebuilt-in sets also support operations such as superset testingand they come in two flavorsmutable and frozen (frozen sets are hashableand thus usable in sets of setsmoreoverset comprehensions are more powerful than suggestedand sets are natural at duplicate removaly { upper( for in 'spamham' {'ssss''aaaa''mmmm''hhhh''pppp'list(set([ ])[ set comprehension remove duplicates from list as for stacksthoughthe built-in set type might not by itself achieve all our goals moreoverhomegrown set implementations turn out to be an ideal vehicle for studying custom data structure implementations in python although the end result may not compete with the performance of built-in set objects todaythe code can still be instructive to readand fun to experiment with also as for stacksa custom set implementation will generally be based upon other built-in types python liststuplesand strings come close to the notion of setthe in operator tests membershipfor iteratesand so on herewe'll add operations not directly supported by python sequences in effectwe're extending built-in types for unique requirements set functions as beforelet' first start out with function-based set manager but this timeinstead of managing shared set object in modulelet' define functions to implement set operations on passed-in python sequences (see example - example - pp \dstruct\basic\inter py "set operations for two sequencesdef intersect(seq seq )res [for in seq if in seq res append(xreturn res def union(seq seq )res list(seq for in seq if not in resstart with an empty list scan the first sequence add common items to the end make copy of seq add new items in seq implementing sets |
5,940 | return res these functions work on any type of sequence--lists stringstuplesand other iterable objects that conform to the protocols expected by these functions (for loopsin membership testsin factwe can even use them on mixed object typesthe last two commands in the following test compute the intersection and union of list and tuple as usual in pythonthe object interface is what mattersnot the specific typesc:\pp \dstruct\basicpython from inter import "spams "scamintersect( )union( ([' '' '' '][' '' '' '' '' ']intersect([ , , ]( , )[ union([ , , ]( , )[ notice that the result is always list hereregardless of the type of sequences passed in we could work around this by converting types or by using class to sidestep this issue (and we will in momentbut type conversions aren' clear-cut if the operands are mixed-type sequences which type do we convert tosupporting multiple operands if we're going to use the intersect and union functions as general toolsone useful extension is support for multiple arguments ( more than twothe functions in example - use python' variable-length argument lists feature to compute the intersection and union of arbitrarily many operands example - pp \dstruct\basic\inter py "set operations for multiple sequencesdef intersect(*args)res [for in args[ ]for other in args[ :]if not in otherbreak elseres append(xreturn res def union(*args)res [for seq in argsfor in seqif not in resres append(xreturn res data structures scan the first list for all other arguments this item in each oneadd common items to the end for all sequence-arguments for all nodes in argument add new items to result |
5,941 | but they also support three or more operands notice the use of an else on the intersection' for loop here to detect common items also note that the last two examples in the following session work on lists with embedded compound objectsthe in tests used by the intersect and union functions apply equality testing to sequence nodes recursivelyas deep as necessary to determine collection comparison resultsc:\pp \dstruct\basicpython from inter import 'spam''slam''scamintersect( [' '' '' 'intersect( [' '' '' 'intersect( 'ham'[' '' 'union( )union( ([' '' '' '' '' '][' '' '' '' '' '' ']intersect([ ]( )range( )[ ( ( )"bye"[ ]"mello" [[ ]"hello"( ) intersect( [ [ ]union( [ ( )'bye'[ ]'mello''hello'( ) xrange okay set classes the preceding section' set functions can operate on variety of objectsbut they aren' as friendly as true objects among other thingsyour scripts need to keep track of the sequences passed into these functions manually classes can do betterthe class in example - implements set object that can hold any type of object like the stack classesit' essentially wrapper around python list with extra set operations example - pp \dstruct\basic\set py "multi-instancecustomizableencapsulated set classclass setdef __init__(selfvalue [])self data [self concat(valuedef intersect(selfother)res [for in self dataif in otherres append(xreturn set(reson object creation manages local list other is any sequence type self is the instance subject return new set implementing sets |
    def union(self, other):
        res = self.data[:]                    # make a copy of my list
        for x in other:                       # add new items in other
            if not x in res:
                res.append(x)
        return set(res)                       # return a new set

    def concat(self, value):                  # value: a list, string, set...
        for x in value:                       # filters out duplicates
            if not x in self.data:
                self.data.append(x)

    def __len__(self):          return len(self.data)
    def __getitem__(self, key): return self.data[key]
    def __and__(self, other):   return self.intersect(other)
    def __or__(self, other):    return self.union(other)
    def __repr__(self):         return '<Set:' + repr(self.data) + '>'

The set class is used like the stack class we saw earlier in this chapter: we make instances and apply sequence operators plus unique set operations to them. Intersection and union can be called as methods, or by using the & and | operators normally used for built-in integer objects. Because we can string operators in expressions now (e.g., x & y & z), there is no obvious need to support multiple operands in intersect/union methods here (though this model's need to create temporary objects within expressions might eventually come to bear on performance). As with all rightly packaged objects, we can either use the set class within a program or test it interactively as follows:

>>> from set import set
>>> users1 = set(['bob', 'emily', 'howard', 'peeper'])
>>> users2 = set(['jerry', 'howard', 'carol'])
>>> users3 = set(['emily', 'carol'])
>>> users1 & users2
<Set:['howard']>
>>> users1 | users2
<Set:['bob', 'emily', 'howard', 'peeper', 'jerry', 'carol']>
>>> users1 | users2 & users3
<Set:['bob', 'emily', 'howard', 'peeper', 'carol']>
>>> (users1 & users2) | users3
<Set:['howard', 'emily', 'carol']>
>>> users1.data
['bob', 'emily', 'howard', 'peeper']

Optimization: moving sets to dictionaries

Once you start using the set class, the first problem you might encounter is its performance: its nested for loops and in scans become exponentially slow. That slowness may or may not be significant in your applications, but library classes should generally be coded as efficiently as possible. One way to optimize set performance is by changing the implementation to use dictionaries rather than lists for storing sets internally; items may be stored as the keys of
5,943 | dictionariesthe in list scans of the original set can be replaced with direct dictionary fetches in this scheme in traditional termsmoving sets to dictionaries replaces slow linear searches with fast hashtable fetches computer scientist would explain this by saying that the repeated nested scanning of the list-based intersection is an exponential algorithm in terms of its complexitybut dictionaries can be linear the module in example - implements this idea its class is subclass of the original setand it redefines the methods that deal with the internal representation but inherits others the inherited and methods trigger the new intersect and union methods hereand the inherited len method works on dictionaries as is as long as set clients are not dependent on the order of items in setmost can switch to this version directly by just changing the name of the module from which set is importedthe class name is the same example - pp \dstruct\basic\fastset py "optimize with linear-time scans using dictionariesimport set fastset set extends set set class set(set set)def __init__(selfvalue [])self data {self concat(valuedef intersect(selfother)res {for in otherif in self datares[xnone return set(res keys()manages local dictionary hashinglinear search times othera sequence or set use hash-table lookup def union(selfother)res {for in otherres[xnone for in self data keys()res[xnone return set(res keys() new dictionary-based set othera sequence or set scan each set just once '&and '|come back here so they make new fastset' def concat(selfvalue)for in valueself data[xnone inherit andorlen def __getitem__(selfix)return list(self data keys())[ix xlist(def __repr__(self)return 'list(self data keys()ditto implementing sets |
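To make the motivation for this rewrite concrete before testing the class, here is a small sketch contrasting membership tests on a list and a dictionary; the sizes and repeat counts are arbitrary choices of mine, and absolute timings will vary by machine.

```python
# Linear list scans versus hash-table dictionary lookups.
import time

size = 100000
as_list = list(range(size))
as_dict = dict.fromkeys(range(size))      # keys matter, values are all None

probe = size - 1                          # worst case for the list scan

start = time.perf_counter()
for _ in range(100):
    found = probe in as_list              # scans the whole list each time
list_time = time.perf_counter() - start

start = time.perf_counter()
for _ in range(100):
    found = probe in as_dict              # a single hash-table fetch
dict_time = time.perf_counter() - start

print(dict_time < list_time)              # dictionaries win handily
```

This is the same effect that makes the dictionary-based set class faster: every `in` test on a list is a linear search, while every `in` test on a dictionary's keys is a constant-time hash lookup.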
5,944 | from fastset import set users set(['bob''emily''howard''peeper']users set(['jerry''howard''carol']users set(['emily''carol']users users users users users users users (users users users users data {'peeper'none'bob'none'howard'none'emily'nonethe main functional difference in this version is the order of items in the setbecause dictionaries are randomly orderedthis set' order will differ from the original the order of results can even vary across python releases (in fact it didbetween python and in the third and fourth editions of this bookfor instanceyou can store compound objects in setsbut the order of items varies in this versionimport setfastset set set([( , )( , )( , )] set set([( , )( , )] fastset set([( , )( , )( , )] fastset set([( , )( , )] sets aren' supposed to be ordered anyhowso this isn' showstopper deviation that might matterthoughis that this version cannot be used to store unhashable (that isimmutableobjects this stems from the fact that dictionary keys must be immutable because values are stored in keysdictionary sets can contain only things such as tuplesstringsnumbersand class objects with immutable signatures mutable objects such as lists and dictionaries won' work directly in this dictionary-based setbut do in the original set class tuples do work here as compound set itemsthoughbecause they are immutablepython computes hash values and tests key equality as expectedset set([[ ],[ ]]fastset set([[ ],[ ]]typeerrorunhashable type'list data structures |
5,945 | fastset set([( )( )]timing the results under python so how did we do on the optimization front this timeagainguesses aren' usually good enoughthough algorithmic complexity seems compelling piece of evidence here to be sureexample - codes script to compare set class performance it reuses the timer module of example - used earlier to compare stacks (our code may implement different objectsbut it doesn' warp timeexample - pp \dstruct\basic\settime py "compare set alternatives performanceimport timersys import setfastset def setops(class) class(range( ) class(range( ) class(range( ) class(range( )for in range( ) xrange okay -integer set -integer set intersections unions if __name__ ='__main__'rept int(sys argv[ ]print('set ='timer test(reptsetopsset set)print('fastset =>'timer test(reptsetopsfastset set)the setops function makes four sets and combines them with intersection and union operators five times command-line argument controls the number of times this whole process is repeated more accuratelyeach call to setops makes set instances ( [ ( )]and runs the intersect and union methods times each ( in the for loop' body the performance improvement is equally dramatic this time aroundon the same windows laptop under python :\pp \dstruct\basicpython settime py set = fastset = :\pp \dstruct\basicpython settime py set = fastset = :\pp \dstruct\basicpython settime py set = fastset = implementing sets |
5,946 | for this specific test casethe dictionary-based set implementation (fastestis roughly three times faster than the simple list-based set (setin factthis threefold speedup is probably sufficient python dictionaries are already optimized hashtables that you might be hard-pressed to improve on unless there is evidence that dictionary-based sets are still too slowour work here is probably done by comparisonresults for python in the prior edition of this book showed fastest to be six times faster than set in all cases either iteration operations sped upor dictionary operations slowed down in in the even older python and second editionthe relative results were the same as they are today in python in any eventthis well underscores the fact that you must test performance on your machine and your python--today' python performance observation may easily be tomorrow' historic anecdote adding relational algebra to sets (externalif you are interested in studying additional set-like operations coded in pythonsee the following files in this book' examples distributionpp \dstruct\basic\rset py rset implementation pp \dstruct\basic\reltest py test script for rsetits expected output is in reltest results txt the rset subclass defined in rset py adds basic relational algebra operations for sets of dictionaries it assumes the items in sets are mappings (rows)with one entry per column (fieldrset inherits all the original set operations (iterationintersectionunionand operatorsuniqueness filteringand so on)and adds new operations as methodsselect return set of nodes that have field equal to given value bagof collect set nodes that satisfy an expression string find select tuples according to comparisonfieldand value match find nodes in two sets with the same values for common fields product compute cartesian productconcatenate tuples from two sets join combine tuples from two sets that have the same value for field data structures |
5,947 | extract named fields from the tuples in table difference remove one set' tuples from another these operations go beyond the tools provided by python' built-in set objectand are prime example of why you may wish to implement custom set type in the first place although have ported this code to run under python xi have not revisited it in any sort of depth for this editionbecause today would probably prefer to implement it as subclass of the built-in set typerather than part of proprietary set implementation coincidentallythat leads us to our next topic subclassing built-in types before we move on to other classical data structuresthere is one more twist in the stack and set story in recent python releasesit is also possible to subclass built-in datatypes such as lists and dictionariesin order to extend them that isbecause datatypes are now themselves customizable classeswe can code unique datatypes that are extensions of built-inswith subclasses that inherit built-in tool sets this is especially true in python xwhere "typeand "classhave become veritable synonyms altogether to demonstrateexample - shows the main parts of module containing variants of our stack and set objects coded in the prior sectionsrevised as customized lists for varietythe set union method has also been simplified slightly here to remove redundant loop example - pp \dstruct\basic\typesubclass py "customize built-in types to extendinstead of starting from scratchclass stack(list)" list with extra methodsdef top(self)return self[- def push(selfitem)list append(selfitemdef pop(self)if not selfreturn none elsereturn list pop(selfavoid exception class set(list) list with extra methods and operatorsdef __init__(selfvalue=[])on object creation list __init__(selfself concat(valuesubclassing built-in types |
5,948 | res [for in selfif in otherres append(xreturn set(resother is any sequence type self is the instance subject def union(selfother)res set(selfres concat(otherreturn res return new set def concat(selfvalue)for in valueif not in selfself append(xnew set with copy of my list insert uniques from other valuea liststringset filters out duplicates lengetitemiter inheriteduse list repr def __and__(selfother)return self intersect(otherdef __or__(selfother)return self union(otherdef __str__(self)return 'class fastset(dict)pass this doesn' simplify much self-test code omittedsee examples package file the stack and set implemented in this code are essentially like those we saw earlierbut instead of embedding and managing listthese objects really are customized lists they add few additional methodsbut they inherit all of the list object' functionality this can reduce the amount of wrapper code requiredbut it can also expose functionality that might not be appropriate in some cases as codedfor examplewe're able to sort and insert into stacks and reverse setbecause we've inherited these methods from the built-in list object in most casessuch operations don' make sense for these data structuresand barring extra code that disables such nonfeaturesthe wrapper class approach of the prior sections may still be preferred for more on the class subtype classessee the remainder of their implementation file in the examples package for self-test code and its expected output because these objects are used in the same way as our original stacks and setsinteracting with them is left as suggested exercise here subclassing built-in types has other applicationswhich may be more useful than those demonstrated by the preceding code consider queueor ordered dictionaryfor example the queue could take the form of list subclass with get and put methods to insert on one end and delete from the otherthe dictionary could be coded as dictionary subclass with an extra list of keys that is sorted on 
insertion or request. This scheme works well for types that resemble built-ins, but type subclasses may not suit every structure in the sections that follow.
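To flesh out the queue idea just mentioned, here is one possible sketch. The put/get names follow the text's description; the rest of the coding is my own choice rather than a listing from this book.

```python
class Queue(list):
    " a list with first-in, first-out access methods "
    def put(self, item):
        self.append(item)            # insert at the back
    def get(self):
        if not self:                 # avoid exception, like Stack.pop
            return None
        return self.pop(0)           # delete from the front

q = Queue()
for x in [1, 2, 3]:
    q.put(x)
results = [q.get() for _ in range(4)]
print(results)                       # -> [1, 2, 3, None]
```

As with the stack and set subclasses, this inherits all list behavior for free, including some (sorting, reversal) a queue arguably shouldn't expose. Note too that `list.pop(0)` shifts every remaining item; for real work, the standard library's `collections.deque` supports constant-time appends and pops at both ends.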
Binary Search Trees

Binary trees are a data structure that imposes an order on inserted nodes: items less than a node are stored in the left subtree, and items greater than a node are inserted in the right. At the bottom, the subtrees are empty. Because of this structure, binary trees naturally support quick, recursive traversals, and hence fast lookup and search in a wide variety of applications: at least ideally, every time you follow a link to a subtree, you divide the search space in half.

Built-in options

Here too, Python supports search operations with built-in tools. Dictionaries, for example, already provide a highly optimized, C-coded search table tool. In fact, indexing a dictionary by key directly is likely to be faster than searching a Python-coded equivalent:

>>> x = {}                                          # empty dict
>>> for i in [3, 1, 9, 2, 7]: x[i] = None           # insert
...
>>> x
{3: None, 1: None, 9: None, 2: None, 7: None}
>>> for i in range(8): print((i, i in x), end=' ')  # lookup
...
(0, False) (1, True) (2, True) (3, True) (4, False) (5, False) (6, False) (7, True)

Because dictionaries are built into the language, they are always available and will usually be faster than Python-based data structure implementations. Built-in sets can often offer similar functionality; in fact, it's not too much of an abstraction to think of sets as valueless dictionaries:

>>> x = set()                                       # empty set
>>> for i in [3, 1, 9, 2, 7]: x.add(i)              # insert
...
>>> for i in range(8): print((i, i in x), end=' ')  # lookup
...
(0, False) (1, True) (2, True) (3, True) (4, False) (5, False) (6, False) (7, True)

In fact, there are a variety of ways to insert items into both sets and dictionaries; both are useful for checking whether a key is stored, but dictionaries further allow search keys to have associated values:

>>> v = [3, 1, 9]
>>> {i for i in v}                                  # set comprehension
5,950 | set( { set constructor {kk+ for in { dict(zip( [ len( )){ dict fromkeys( { dict comprehension dict constructor dict method so why bother with custom search data structure implementation heregiven such flexible built-insin some applicationsyou might notbut here especiallya custom implementation often makes sense to allow for customized tree algorithms for instancecustom tree balancing can help speed lookups in pathological casesand might outperform the generalized hashing algorithms used in dictionaries and sets moreoverthe same motivations we gave for custom stacks and sets apply here as well--by encapsulating tree access in class-based interfaceswe support future extension and change in more manageable ways implementing binary trees binary trees are named for the implied branch-like structure of their subtree links typicallytheir nodes are implemented as triple of values(leftsubtreenodevaluerightsubtreebeyond thatthere is fairly wide latitude in the tree implementation here we'll use class-based approachbinarytree is header objectwhich initializes and manages the actual tree emptynode is the empty objectshared at all empty subtrees (at the bottombinarynode objects are nonempty tree nodes with value and two subtrees instead of coding distinct search functionsbinary trees are constructed with "smartobjects--class instances that know how to handle insert/lookup and printing requests and pass them to subtree objects in factthis is another example of the object-oriented programming (oopcomposition relationship in actiontree nodes embed other tree nodes and pass search requests off to the embedded subtrees single empty class instance is shared by all empty subtrees in binary treeand inserts replace an empty node with binarynode at the bottom example - shows what this means in code example - pp \dstruct\classics\btree py " valueless binary search treeclass binarytreedef __init__(self)def __repr__(self)def lookup(selfvalue)def insert(selfvalue) data structures 
self.tree = emptynode()
return repr(self.tree)
return self.tree.lookup(value)
self.tree = self.tree.insert(value)
5,951 | def __repr__(self)return '*def lookup(selfvalue)return false def insert(selfvalue)return binarynode(selfvalueselfclass binarynodedef __init__(selfleftvalueright)self dataself leftself right fail at the bottom add new node at bottom valueleftright def lookup(selfvalue)if self data =valuereturn true elif self data valuereturn self left lookup(valueelsereturn self right lookup(valuedef insert(selfvalue)if self data valueself left self left insert(valueelif self data valueself right self right insert(valuereturn self look in left look in right grow in left grow in right def __repr__(self)return ('% % % )(repr(self left)repr(self data)repr(self right))as usualbinarytree can contain objects of any type that support the expected interface protocol--hereand comparisons this includes class instances with the __lt__ and __gt__ methods let' experiment with this module' interfaces the following code stuffs five integers into new treeand then searches for values as we did earlier for dictionaries and setsc:\pp \dstruct\classicspython from btree import binarytree binarytree(for in [ ] insert(ifor in range( )print((ix lookup( ))end='( false( true( true( true( false( false( false( trueto watch this tree growadd print call statement after each insert tree nodes print themselves as triplesand empty nodes print as the result reflects tree nestingy binarytree( for in [ ] insert( )print(ybinary search trees |
5,952 | * ) * ) * * * ) * * * ) * ) at the end of this we'll see another way to visualize such trees in gui named pytree (you're invited to flip ahead now if you prefernode values in this tree object can be any comparable python object--for instancehere is tree of stringsz binarytree(for in 'badce' insert(cz *' ')' '*' ')' '*' ' binarytree(for in 'abcde' insert(cz *' '*' '*' '*' '*' ' binarytree(for in 'edcba' insert(cz *' ')' ')' ')' ')' 'notice the last tests hereif items inserted into binary tree are already orderedyou wind up with linear structure and lose the search-space partitioning magic of binary trees (the tree grows in right or left branches onlythis is worst-case scenarioand binary trees generally do good job of dividing values in practice but if you are interested in pursuing this topic furthersee data structures text for tree-balancing techniques that automatically keep the tree as dense as possible but are beyond our scope here trees with both keys and values also note that to keep the code simplethese trees store value only and lookups return true or false result in practiceyou sometimes may want to store both key and an associated value (or even moreat each tree node example - shows what such tree object looks likefor any prospective lumberjacks in the audience example - pp \dstruct\classics\btreevals py " binary search tree with values for stored keysclass keyedbinarytreedef __init__(self)def __repr__(self)def lookup(selfkey)def insert(selfkeyval) data structures self tree emptynode(return repr(self treereturn self tree lookup(keyself tree self tree insert(keyval |
5,953 | def __repr__(self)return '*def lookup(selfkey)return none def insert(selfkeyval)return binarynode(selfkeyvalselffail at the bottom add node at bottom class binarynodedef __init__(selfleftkeyvalright)self keyself val keyval self leftself right leftright def lookup(selfkey)if self key =keyreturn self val elif self key keyreturn self left lookup(keyelsereturn self right lookup(keydef insert(selfkeyval)if self key =keyself val val elif self key keyself left self left insert(keyvalelif self key keyself right self right insert(keyvalreturn self look in left look in right grow in left grow in right def __repr__(self)return ('% % =% % )(repr(self left)repr(self key)repr(self val)repr(self right))if __name__ ='__main__' keyedbinarytree(for (keyvalin [('bbb' )('aaa' )('ccc' )] insert(keyvalprint(tprint( lookup('aaa') lookup('ccc') insert('ddd' insert('aaa' changes key' value print(tthe following shows this script' self-test code at worknodes simply have more content this time aroundc:\pp \dstruct\classicspython btreevals py *'aaa'= )'bbb'= *'ccc'= *'aaa'= )'bbb'= *'ccc'= *'ddd'= in factthe effect is similar to the keys and values of built-in dictionarybut custom tree structure like this might support custom use cases and algorithmsas well as code binary search trees |
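One example of the sort of custom algorithm a class-based tree enables: an in-order traversal that yields a tree's stored values in sorted order. The classes below are a minimal re-sketch of this chapter's valueless tree, and the inorder generator is my own addition, not part of the book's btree.py; it is written for current Pythons, which support `yield from`.

```python
class EmptyNode:
    def insert(self, value):
        return BinaryNode(self, value, self)       # grow a node at the bottom
    def inorder(self):
        return iter(())                            # nothing below an empty node

class BinaryNode:
    def __init__(self, left, value, right):
        self.data, self.left, self.right = value, left, right
    def insert(self, value):
        if self.data > value:
            self.left = self.left.insert(value)    # grow in left
        elif self.data < value:
            self.right = self.right.insert(value)  # grow in right
        return self
    def inorder(self):
        yield from self.left.inorder()             # left subtree first,
        yield self.data                            # then this node,
        yield from self.right.inorder()            # then the right subtree

class BinaryTree:
    def __init__(self):
        self.tree = EmptyNode()
    def insert(self, value):
        self.tree = self.tree.insert(value)
    def inorder(self):
        return self.tree.inorder()

t = BinaryTree()
for i in [3, 1, 9, 2, 7]:
    t.insert(i)
print(list(t.inorder()))                           # -> [1, 2, 3, 7, 9]
```

Built-in dictionaries and sets give fast membership tests, but not an ordered walk like this without an explicit sort; that kind of tailoring is a typical reason to keep a custom tree around.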
Custom trees can also support algorithms beyond the built-in gang. With that, though, we need to move on to the next section.

Graph Searching

Many problems that crop up in both real life and real programming can be fairly represented as a graph, a set of states with transitions ("arcs") that lead from one state to another. For example, planning a route for a trip is really a graph search problem in disguise: the states are places you'd like to visit, and the arcs are the transportation links between them. A program that searches for a trip's optimal route is a graph searcher. For that matter, so are many programs that walk hyperlinks on the Web.

This section presents simple Python programs that search through a directed, cyclic graph to find the paths between a start state and a goal. Graphs can be more general than trees, because links may point at arbitrary nodes, even ones already searched (hence the word cyclic). Moreover, there isn't any direct built-in support for this type of goal: although graph searchers may ultimately use built-in types, their search routines are custom enough to warrant proprietary implementations.

The graph used to test searchers in this section is sketched in this section's figure. Arrows at the ends of arcs indicate valid paths (e.g., A leads to B, E, and G). The search algorithms will traverse this graph in depth-first fashion, and they will trap cycles in order to avoid looping. If you pretend that this is a map, where nodes represent cities and arcs represent roads, this example will probably seem a bit more meaningful.

Figure: A directed graph

Implementing graph search

The first thing we need to do is choose a way to represent this graph in a Python script. One approach is to use built-in datatypes and searcher functions. The file in the example that follows builds the test graph as a simple dictionary: each state is a dictionary key
5,955 | we'll use to run few searches in the graph example - pp \dstruct\classics\gtestfunc py "dictionary based graph representationgraph {' '' '' '' '' '' '' '[' '' '' '][' '][' '' '][' '][' '' '' ']][' ' directedcyclic graph stored as dictionary 'keyleads-to [nodesdef tests(searcher)test searcher function print(searcher(' '' 'graph)find all paths from 'eto 'dfor in ['ag''gf''ba''da']print(xsearcher( [ ] [ ]graph)nowlet' code two modules that implement the actual search algorithms they are both independent of the graph to be searched (it is passed in as an argumentthe first searcherin example - uses recursion to walk through the graph example - pp \dstruct\classics\gsearch py "find all paths from start to goal in graphdef search(startgoalgraph)solns [generate([start]goalsolnsgraphsolns sort(key=lambda xlen( )return solns collect paths sort by path length def generate(pathgoalsolnsgraph)state path[- if state =goalfound goal here solns append(pathchange solns in-place elsecheck all arcs here for arc in graph[state]skip cycles on path if arc not in pathgenerate(path [arc]goalsolnsgraphif __name__ ='__main__'import gtestfunc gtestfunc tests(searchthe second searcherin example - uses an explicit stack of paths to be expanded using the tuple-tree stack representation we explored earlier in this graph searching |
5,956 | "graph searchusing paths stack instead of recursiondef search(startgoalgraph)solns generate(([start][])goalgraphsolns sort(key=lambda xlen( )return solns def generate(pathsgoalgraph)solns [while pathsfrontpaths paths state front[- if state =goalsolns append(frontelsefor arc in graph[state]if arc not in frontpaths (front [arc])paths return solns returns solns list use tuple-stack pop the top path goal on this path add all extensions if __name__ ='__main__'import gtestfunc gtestfunc tests(searchto avoid cyclesboth searchers keep track of nodes visited along path if an extension is already on the current pathit is loop the resulting solutions list is sorted by increasing lengths using the list sort method and its optional key value transform argument to test the searcher modulessimply run themtheir self-test code calls the canned search test in the gtestfunc modulec:\pp \dstruct\classicspython gsearch py [[' '' '' '][' '' '' '' '' '' ']ag [[' '' '][' '' '' '][' '' '' '' '' ']gf [[' '' '' '' '][' '' '' '' '' '' '][' '' '' '' '' '' '][' '' '' '' '' '' ']ba [[' '' '' '' '' ']da [ :\pp \dstruct\classicspython gsearch py [[' '' '' '][' '' '' '' '' '' ']ag [[' '' '][' '' '' '][' '' '' '' '' ']gf [[' '' '' '' '][' '' '' '' '' '' '][' '' '' '' '' '' '][' '' '' '' '' '' ']ba [[' '' '' '' '' ']da [this output shows lists of possible paths through the test graphi added two line breaks to make it more readable (python pprint pretty-printer module might help with readability here as wellnotice that both searchers find the same paths in all testsbut the order in which they find those solutions may differ the gsearch order depends on data structures |
5,957 | and code to see how moving graphs to classes using dictionaries to represent graphs is efficientconnected nodes are located by fast hashing operation but depending on the applicationother representations might make more sense for instanceclasses can be used to model nodes in networktoomuch like the binary tree example earlier as classesnodes may contain content useful for more sophisticated searches they may also participate in inheritance hierarchiesto acquire additional behaviors to illustrate the basic ideaexample - shows an alternative coding for our graph searcherits algorithm is closest to gsearch example - pp \dstruct\classics\graph py "build graph with objects that know how to searchclass graphdef __init__(selflabelextra=none)self name label self data extra self arcs [nodes=inst objects graph=linked objs def __repr__(self)return self name def search(selfgoal)graph solns [self generate([self]goalgraph solns sort(key=lambda xlen( )return graph solns def generate(selfpathgoal)if self =goalgraph solns append(pathelsefor arc in self arcsif arc not in patharc generate(path [arc]goalclass =tests addr or self solnssame in this versiongraphs are represented as network of embedded class instance objects each node in the graph contains list of the node objects it leads to (arcs)which it knows how to search the generate method walks through the objects in the graph but this timelinks are directly available on each node' arcs listthere is no need to index (or passa dictionary to find linked objects the search is effectively spread across the graph' linked objects to testthe module in example - builds the test graph againthis time using linked instances of the graph class notice the use of exec in this codeit executes dynamically graph searching |
5,958 | =graph(' ')and so onexample - pp \dstruct\classics\gtestobj py "build class-based graph and run test searchesfrom graph import graph this doesn' work inside def in undefined for name in "abcdefg"exec("% graph('% ')(namename) arcs [begb arcs [cc arcs [ded arcs [fe arcs [cfgg arcs [amake objects first label=variable-name now configure their linksembedded class-instance list search(gfor (startstopin [( , )( , )( , )( , )( , )]print(start search(stop)running this test executes the same sort of graph walkingbut this time it' routed through object methodsc:\pp \dstruct\classicspython gtestobj py [[ecd][egabcd][[ag][aeg][abceg][[gaef][gabcdf][gabcef][gaecdf][[bcega][the results are the same as for the functionsbut node name labels are not quotednodes on path lists here are graph instancesand this class' __repr__ scheme suppresses quotes example - is one last graph test before we move onsketch the nodes and arcs on paper if you have more trouble following the paths than python example - pp \dstruct\classics\gtestobj py from graph import graph graph(' ' graph(' ' graph(' ' graph(' ' arcs [pmp arcs [smaa arcs [mprint( search( ) graph of spam make node objects leads to and arcsembedded objects find all paths from to data structures |
5,959 | scratched the surface of this academic yet useful domain hereexperiment further on your ownand see other books for additional topics ( breadth-first search by levelsand best-first search by path or state scores) :\pp \dstruct\classicspython gtestobj py [[sm][spm][spam]permuting sequences our next data structure topic implements extended functionality for sequences that is not present in python' built-in objects the functions defined in example - shuffle sequences in number of wayspermute constructs list with all valid permutations of any sequence subset constructs list with all valid permutations of specific length combo works like subsetbut order doesn' matterpermutations of the same items are filtered out these results are useful in variety of algorithmssearchesstatistical analysisand more for instanceone way to find an optimal ordering for items is to put them in listgenerate all possible permutationsand simply test each one in turn all three of the functions make use of generic sequence slicing so that the result list contains sequences of the same type as the one passed in ( when we permute stringwe get back list of stringsexample - pp \dstruct\classics\permcomb py "permutation-type operations for sequencesdef permute(list)if not listreturn [listelseres [for in range(len(list))rest list[:ilist[ + :for in permute(rest)res append(list[ : + xreturn res def subset(listsize)if size = or not listreturn [list[: ]elseresult [shuffle any sequence empty sequence delete current node permute the others add node at front order matters here an empty sequence permuting sequences |
5,960 | pick list[ : + rest list[:ilist[ + :for in subset(restsize- )result append(pick xreturn result def combo(listsize)if size = or not listreturn [list[: ]elseresult [for in range( (len(listsize )pick list[ : + rest list[ + :for in combo(restsize )result append(pick xreturn result sequence slice keep [:ipart order doesn' matter xyz =yzx iff enough left drop [:ipart all three of these functions work on any sequence object that supports lenslicingand concatenation operations for instancewe can use permute on instances of some of the stack classes defined at the start of this (experiment with this on your own for more insightsthe following session shows our sequence shufflers in action permuting list enables us to find all the ways the items can be arranged for instancefor four-item listthere are possible permutations ( after picking one of the four for the first positionthere are only three left to choose from for the secondand so on order matters[ , , is not the same as [ , , ]so both appear in the resultc:\pp \dstruct\classicspython from permcomb import permute([ ][[ ][ ][ ][ ][ ][ ]permute('abc'['abc''acb''bac''bca''cab''cba'permute('help'['help''hepl''hlep''hlpe''hpel''hple''ehlp''ehpl''elhp''elph''ephl''eplh''lhep''lhpe''lehp''leph''lphe''lpeh''phel''phle''pehl''pelh''plhe''pleh'combo results are related to permutationsbut fixed-length constraint is put on the resultand order doesn' matterabc is the same as acbso only one is added to the result setcombo([ ] [[ ]combo('abc' ['abc'combo('abc' ['ab''ac''bc'combo('abc' [combo(( ) data structures |
5,961 | for in range( )print(icombo("help" ) ['' [' '' '' '' ' ['he''hl''hp''el''ep''lp' ['hel''hep''hlp''elp' ['help' [finallysubset is just fixed-length permutationsorder mattersso the result is larger than for combo in factcalling subset with the length of the sequence is identical to permutesubset([ ] [[ ][ ][ ][ ][ ][ ]subset('abc' ['abc''acb''bac''bca''cab''cba'for in range( )print(isubset("help" ) ['' [' '' '' '' ' ['he''hl''hp''eh''el''ep''lh''le''lp''ph''pe''pl' ['hel''hep''hle''hlp''hpe''hpl''ehl''ehp''elh''elp''eph''epl''lhe''lhp''leh''lep''lph''lpe''phe''phl''peh''pel''plh''ple' ['help''hepl''hlep''hlpe''hpel''hple''ehlp''ehpl''elhp''elph''ephl''eplh''lhep''lhpe''lehp''leph''lphe''lpeh''phel''phle''pehl''pelh''plhe''pleh' ['help''hepl''hlep''hlpe''hpel''hple''ehlp''ehpl''elhp''elph''ephl''eplh''lhep''lhpe''lehp''leph''lphe''lpeh''phel''phle''pehl''pelh''plhe''pleh'these are some fairly dense algorithms (and franklymay seem to require zen-like "moment of clarityto grasp completely)but they are not too obscure if you trace through few simple cases first they're also representative of the sort of operation that requires custom data structure code--unlike the last stop on our data structures tour in the next section reversing and sorting sequences the permutation utilities of the prior section are useful in variety of applicationsbut there are even more fundamental operations that might seem prime candidates for automation reversing and sorting collections of valuesfor exampleare core operations in broad range of programs to demonstrate coding techniquesand to provide examples that yield closure for recurring theme of this let' take quick look at both of these in turn reversing and sorting sequences |
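As a closing footnote to the permutation utilities of the prior section: Python's standard itertools module provides built-in counterparts to permute and combo. Unlike the chapter's functions, these return iterators of tuples rather than sequences of the argument's original type, so a join step is needed to get strings back:

```python
from itertools import permutations, combinations

# order matters, like the chapter's permute: 3! = 6 arrangements
print([''.join(p) for p in permutations('abc')])
# ['abc', 'acb', 'bac', 'bca', 'cab', 'cba']

# order doesn't matter, like combo: 'ab' and 'ba' count only once
print([''.join(c) for c in combinations('abc', 2)])
# ['ab', 'ac', 'bc']
```

Because they are implemented in C and yield results lazily, these are generally the better choice for large inputs; the handcoded versions remain useful for studying the algorithms themselves.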
5,962 | reversal of collections can be coded either recursively or iteratively in pythonand as functions or class methods example - is first attempt at two simple reversal functions example - pp \dstruct\classics\rev py def reverse(list)recursive if list =[]return [elsereturn reverse(list[ :]list[: def ireverse(list)iterative res [for in listres [xres return res both reversal functions work correctly on lists but if we try reversing nonlist sequences (stringstuplesand so on)the ireverse function always returns list for the result regardless of the type of sequence passedireverse("spam"[' '' '' '' 'much worsethe recursive reverse version won' work at all for nonlists--it gets stuck in an infinite loop the reason is subtlewhen reverse reaches the empty string ("")it' not equal to the empty list ([])so the else clause is selected but slicing an empty sequence returns another empty sequence (indexes are scaled)the else clause recurs again with an empty sequencewithout raising an exception the net effect is that this function gets stuck in loopcalling itself over and over again until python runs out of memory the versions in example - fix both problems by using generic sequence handling techniques much like that of the prior section' permutation utilitiesreverse uses the not operator to detect the end of the sequence and returns the empty sequence itselfrather than an empty list constant since the empty sequence is the type of the original argumentthe operation always builds the correct type sequence as the recursion unfolds ireverse makes use of the fact that slicing sequence returns sequence of the same type it first initializes the result to the slice [: ] newempty slice of the argument' type laterit uses slicing to extract one-node sequences to add to the result' frontinstead of list constant example - pp \dstruct\classics\rev py def reverse(list)if not listreturn list data structures empty(not always []the same sequence type |
5,963 | return reverse(list[ :]list[: def ireverse(list)res list[: for in range(len(list))res list[ : + res return res add front item on the end emptyof same type add each item to front the combination of the changes allows the new functions to work on any sequenceand return new sequence of the same type as the sequence passed in if we pass in stringwe get new string as the result in factthey reverse any sequence object that responds to slicingconcatenationand len--even instances of python classes and types in other wordsthey can reverse any object that has sequence interface protocols here they are working on listsstringsand tuplesfrom rev import reverse([ ])ireverse([ ]([ ][ ]reverse("spam")ireverse("spam"('maps''maps'reverse(( ))ireverse(( )(( )( )implementing sorts another staple of many systems is sortingordering items in collection according to some constraint the script in example - defines simple sort routine in pythonwhich orders list of objects on field because python indexing is genericthe field can be an index or key--this function can sort lists of either sequences or mappings example - pp \dstruct\classics\sort py def sort(listfield)res [for in listi for in resif [field< [field]break + res[ : [xreturn res always returns list list node goes hereinsert in result slot if __name__ ='__main__'table {'name':'john''age': }{'name':'doe''age': print(sort(table'name')print(sort(table'age')table ('john' )('doe' print(sort(table )print(sort(table )reversing and sorting sequences |
5,964 | last sort tuplesc:\pp \dstruct\classicspython sort py [{'age' 'name''doe'}{'age' 'name''john'}[{'age' 'name''john'}{'age' 'name''doe'}[('doe' )('john' )[('john' )('doe' )adding comparison functions since functions can be passed in like any other objectwe can easily allow for an optional comparison function in the next versionexample - the second argument takes function that should return true if its first argument should be placed before its second lambda is used to provide an ascending-order test by default this sorter also returns new sequence that is the same type as the sequence passed inby applying the slicing techniques used in the reversal tools earlier--if you sort tuple of nodesyou get back tuple example - pp \dstruct\classics\sort py def sort(seqfunc=(lambda ,yx < ))res seq[: for in range(len(seq)) for in resif func(seq[ ] )break + res res[:iseq[ : + res[ :return res defaultascending return seq' type seq can be immutable if __name__ ='__main__'table ({'name':'doe'}{'name':'john'}print(sort(list(table)(lambda xyx['name' ['name']))print(sort(tuple(table)(lambda xyx['name'< ['name']))print(sort('axbyzc')this timethe table entries are ordered per field comparison function passed inc:\pp \dstruct\classicspython sort py [{'name''john'}{'name''doe'}({'name''doe'}{'name''john'}abcxyz this version also dispenses with the notion of field altogether and lets the passed-in function handle indexing if needed that makes this version much more generalfor instanceit' also useful for sorting strings data structures versus built-insthe conclusion but now that 've shown you these reversing and sorting algorithmsi need to also tell you that they may not be an optimal approachespecially in these specific cases data structures |
5,965 | accomplish what we just did the hard waybuilt-in sorting tools python' two built-in sorting tools are so fast that you would be hard-pressed to beat them in most scenarios to use the list object' sort method for arbitrary kinds of iterablesconvert first if neededtemp list(sequencetemp sort(use items in temp alternativelythe sorted built-in function operates on any iterable so you don' need to convertand returns new sorted result list so you can use it within larger expression or context because it is not an in-place changeyou also don' need to be concerned about the possible side effects of changing the original listfor item in sorted(iterable)use item for custom sortssimply pass in the key keyword arguments to tailor the built-in sort' operation--it maps values to sort keys instead of performing comparisonsbut the effect is just as general (see the earlier graph searcherslength ordering for another example) [{' ': }{' ': }{' ': }{' ': } sort(key=lambda xx[' '] [{' ' }{' ' }{' ' }{' ' }both sorting tools also accept boolean reverse flag to make the result order descending instead of ascendingthere is no need to manually reverse after the sort the underlying sort routine in python is so good that its documentation claims that it has "supernatural performance"--not bad for sort built-in reversal tools our reversal routines are generally superfluous by the same token--because python provides for fast reversals in both in-place and iterable formsyou're likely better off using them whenever possiblel [ reverse( [ [ list(reversed( )[ in factthis has been recurring theme of this on purposeto underscore key point in python workalthough there are plenty of exceptionsyou're generally better reversing and sorting sequences |
5,966 | will often prove better choice in the end make no mistakesometimes you really do need objects that add functionality to builtin types or do something more custom the set classes we metfor instancecan add custom tools not directly supported by python todaybinary search trees may support some algorithms better than dictionaries and sets canand the tuple-tree stack implementation was actually faster than one based on built-in lists for common usage patterns moreovergraphs and permutations are something you must code on your own as we've also seenclass encapsulations make it possible to change and extend object internals without impacting the rest of your system although subclassing built-in types can address much of the same goalsthe end result is still custom data structure yet because python comes with set of built-inflexibleand optimized datatypesdata structure implementations are often not as important in python as they are in lesserequipped and lower-level programming languages before you code that new datatypebe sure to ask yourself whether built-in type or call might be more in line with the python way of thinking for more on extended data structures for use in pythonsee also the relatively new collections module in its standard library as mentioned in the preceding this module implements named tuplesordered dictionariesand more it' described in python' library manualbut its source codelike much in the standard librarycan serve as source of supplemental examples as well pytreea generic tree object viewer this has been command line-oriented to wrap upi want to refer you to program that merges the gui technology we studied earlier in the book with some of the data structure ideas we've explored here this program is called pytree-- generic tree data structure viewer written in python with the tkinter gui library pytree sketches out the nodes of tree on-screen as boxes connected by arrows it also knows how to route mouse clicks on drawn tree nodes back to 
the tree, to trigger tree-specific actions. Because PyTree lets you visualize the structure of the tree generated by a set of parameters, it's an arguably fun way to explore tree-based algorithms. PyTree supports arbitrary tree types by "wrapping" real trees in interface objects. The interface objects implement a standard protocol by communicating with the underlying tree object. For the purposes of this chapter, PyTree is instrumented to display binary search trees; for the next, it's also set up to render expression parse trees. New trees can be viewed by coding wrapper classes to interface to new tree types.
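To make the wrapper idea concrete, here is a hypothetical sketch of an adapter in this general style. The class and method names here (TreeWrapper, children, label) are invented for illustration only and are not PyTree's actual API; the point is just that a viewer can call a small generic protocol, which each wrapper maps onto a real tree's objects:

```python
class TreeWrapper:
    """Hypothetical adapter protocol: a generic viewer calls these
    methods, and each wrapper maps them onto a concrete tree type."""
    def children(self, node):
        raise NotImplementedError
    def label(self, node):
        raise NotImplementedError

class ListTreeWrapper(TreeWrapper):
    # wrap nested lists of the form [label, [subtree, subtree, ...]]
    def children(self, node):
        return node[1]
    def label(self, node):
        return str(node[0])

tree = ['+', [['1', []], ['*', [['2', []], ['3', []]]]]]
wrap = ListTreeWrapper()
print(wrap.label(tree))                                 # '+'
print([wrap.label(c) for c in wrap.children(tree)])     # ['1', '*']
```

A viewer coded against only the wrapper protocol never needs to know whether the underlying tree is a nested list, a class-instance network, or something else entirely.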
5,967 | it is written with python and tkinterit should be portable to windowsunixand macs at the topit' used with code of this formroot tk(bwrapper binarytreewrapper(pwrapper parsetreewrapper(viewer treeviewer(bwrapperrootbuild single viewer gui add extrasinput linetest btns make wrapper objects start out in binary mode def onradio()if var get(='btree'viewer settreetype(bwrapperelif var get(='ptree'viewer settreetype(pwrapperchange viewer' wrapper figure - captures the display produced by pytree under python on windows by running its top-level treeview py file with no argumentspytree can also be started from button in the pydemos launcher we met in as usualyou can run it on your own computer to get better feel for its behavior although this screenshot captures one specific kind of treepytree is general enough to display arbitrary tree typesand can even switch them while running figure - pytree viewing binary search tree (test buttonpytreea generic tree object viewer |
5,968 | to save real estate here to study pytreesee the following directoryc:\pp \dstruct\treeview also like pyformthe documentation directory there has the original description of this example that appeared in the third edition of this bookpytree' code is in python formbut the third edition overview may not be as mentionedpytree is set up to display both the binary search trees of this and the expression parse trees of the next when viewing the latterpytree becomes sort of visual calculator--you can generate arbitrary expression trees and evaluate any part of them by clicking on nodes displayed but at this pointthere is not much more can show and/or tell you about those kinds of trees until you move on to data structures |
text and language

"See Jack hack. Hack, Jack, hack!"

In one form or another, processing text-based information is one of the more common tasks that applications need to perform. This can include anything from scanning a text file by columns to analyzing statements in a language defined by a formal grammar. Such processing usually is called parsing: analyzing the structure of a text string. In this chapter, we'll explore ways to handle language and text-based information and summarize some Python development concepts in sidebars along the way. In the process, we'll meet string methods, text pattern matching, XML and HTML parsers, and other tools. Some of this material is advanced, but the examples are small to keep this chapter short. For instance, recursive descent parsing is illustrated with a simple example to show how it can be implemented in Python. We'll also see that it's often unnecessary to write custom parsers for each language processing task in Python; they can usually be replaced by exporting APIs for use in Python programs, and sometimes by a single built-in function call. Finally, this chapter closes by presenting PyCalc, a calculator GUI written in Python, and the last major Python coding example in this text. As we'll see, writing calculators isn't much more difficult than juggling stacks while scanning text.

strategies for processing text in python

In the grand scheme of things, there are a variety of ways to handle text processing and language analysis in Python:

Expressions: built-in string object expressions
Methods: built-in string object method calls
Patterns: regular expression pattern matching
5,970 | xml and html text parsing parsersgrammars custom language parsersboth handcoded and generated embedding running python code with eval and exec built-ins and more natural language processing for simpler taskspython' built-in string object is often all we really need python strings can be indexedconcatenatedslicedand processed with both string method calls and built-in functions our main emphasis in this is mostly on higherlevel tools and techniques for analyzing textual information and languagebut we'll briefly explore each of these techniques in turn let' get started some readers may have come to this seeking coverage of unicode texttoobut this topic is not presented here for look at python' unicode supportsee ' discussion of string tools' discussion of text and binary file distinctions and encodingsand ' coverage of text in tkinter guis unicode also appears in various internet and database topics throughout this book ( email encodingsbecause unicode is core language topicall these will also refer you to the fuller coverage of unicode in learning pythonfourth edition most of the topics in this including string methods and pattern matchingapply to unicode automatically simply because the python str string type is unicodewhether ascii or wider string method utilities the first stop on our text and language tour is the most basicpython' string objects come with an array of text processing toolsand serve as your first line of defense in this domain as you undoubtedly know by nowconcatenationslicingformattingand other string expressions are workhorses of most programs ( ' including the newer format method in this categoryas it' really just an alternative to the expression)'spam eggs ham'[ : 'eggs 'spam 'eggs ham'spam eggs ham'spam % % ('eggs''ham''spam eggs ham'spam {{}format('eggs''ham''spam eggs ham'spam "%- "%+ ('ham' text and language slicingsubstring extraction concatenation (and *len()[ix]formatting expressionsubstitution formatting methodalternative more 
complex formatting
5,971 | 'spam "{ :< }"{ :+ }format('ham''spam "ham "+ these operations are covered in core language resources such as learning python for the purposes of this thoughwe're interested in more powerful toolspython' string object methods include wide variety of text-processing utilities that go above and beyond string expression operators we saw some of these in action early on in and have been using them ever since for instancegiven an instance str of the built-in string object typeoperations like the following are provided as object method callsstr find(substrperforms substring searches str replace(oldnewperforms substring substitutions str split(delimiterchops up string around delimiter or whitespace str join(iterableputs substrings together with delimiters between str strip(removes leading and trailing whitespace str rstrip(removes trailing whitespace onlyif any str rjust(widthright-justifies string in fixed-width field str upper(converts to uppercase str isupper(tests whether the string is uppercase str isdigit(tests whether the string is all digit characters str endswith(substr-or-tupletests for substring (or tuple of alternativesat the end str startswith(substr-or-tupletests for substring (or tuple of alternativesat the front this list is representative but partialand some of these methods take additional optional arguments for the full list of string methodsrun dir(strcall at the python interactive prompt and run help(str methodon any method for some quick documentation the python library manual and reference books such as python pocket reference also include an exhaustive list string method utilities |
5,972 | strings the latter makes them applicable to arbitrarily encoded unicode textsimply because the str type is unicode texteven if it' only ascii these methods originally appeared as function in the string modulebut are only object methods todaythe string module is still present because it contains predefined constants ( string ascii_uppercase)as well as the template substitution interface in and later--one of the techniques discussed in the next section templating with replacements and formats by way of reviewlet' take quick look at string methods in the context of some of their most common use cases as we saw when generating html forwarding pages in the string replace method is often adequate by itself as string templating tool--we can compute values and insert them at fixed positions in string with simple replacement callstemplate '---$target ---$target ---val 'spamval 'shrubberytemplate template replace('$target 'val template template replace('$target 'val template '---spam---shrubbery---as we also saw when generating html reply pages in the cgi scripts of and the string formatting operator is also powerful templating toolespecially when combined with dictionaries--simply fill out dictionary with values and apply multiple substitutions to the html string all at oncetemplate ""----%(key ) ----%(key ) -""vals {vals['key ''spamvals['key ''shrubberyprint(template valsspamshrubbery--beginning with python the string module' template feature is essentially simplified and limited variation of the dictionary-based format scheme just shownbut it allows some additional call patterns which some may consider simplervals {'key ''shrubbery''key ''spam' text and language |
5,973 | template string template('---$key ---$key ---'template substitute(vals'---spam---shrubbery---template substitute(key ='brian'key ='loretta''---brian---loretta---see the library manual for more on this extension although the string datatype does not itself support the pattern-directed text processing that we'll meet later in this its tools are powerful enough for many tasks parsing with splits and joins in terms of this main focuspython' built-in tools for splitting and joining strings around tokens turn out to be especially useful when it comes to parsing textstr split(delimiter?maxsplits?splits string into list of substringsusing either whitespace (tabsspacesnewlinesor an explicitly passed string as delimiter maxsplits limits the number of splits performedif passed delimiter join(iterableconcatenates sequence or other iterable of substrings ( listtuplegenerator)adding the subject separator string between each these two are among the most powerful of string methods as we saw in split chops string into list of substrings and join puts them back together' dsplit([' '' '' '' '' + + +dsplit('+'[' '' '' '' ''--join([' '' '' ']' -- --cdespite their simplicitythey can handle surprisingly complex text-parsing tasks moreoverstring method calls are very fast because they are implemented in language code for instanceto quickly replace all tabs in file with four periodspipe the file into script that looks like thisfrom sys import stdout write(( join(stdin read(split('\ '))the split call here divides input around tabsand the join puts it back together with periods where tabs had been in this casethe combination of the two calls is equivalent to using the simpler global replacement string method call as followsstdout write(stdin read(replace('\ ' )as we'll see in the next sectionsplitting strings is sufficient for many text-parsing goals string method utilities |
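One more templating option worth noting before moving on: the str.format method also accepts dictionary values, either through keyword unpacking or, in Python 3.2 and later, through the format_map variant, which takes the mapping directly. This is an alternative to the % dictionary form shown earlier:

```python
template = '---{key1}---{key2}---'
vals = {'key1': 'spam', 'key2': 'shrubbery'}

print(template.format(**vals))      # unpack the dict into keyword arguments
print(template.format_map(vals))    # 3.2+: pass the mapping directly
# both print: ---spam---shrubbery---
```

Which form to use is largely a matter of taste; all of them substitute named values into fixed positions in a template string.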
5,974 | let' look next at some practical applications of string splits and joins in many domainsscanning files by columns is fairly common task for instancesuppose you have file containing columns of numbers output by another systemand you need to sum each column' numbers in pythonstring splitting is the core operation behind solving this problemas demonstrated by example - as an added bonusit' easy to make the solution reusable tool in python by packaging it as an importable function example - pp \lang\summer py #!/usr/local/bin/python def summer(numcolsfilename)sums [ numcols for line in open(filename)cols line split(for in range(numcols)sums[ +eval(cols[ ]return sums if __name__ ='__main__'import sys print(summer(eval(sys argv[ ])sys argv[ ])make list of zeros scan file' lines split up columns around blanks/tabs add numbers to sums 'summer py cols filenotice that we use file iterators here to read line by lineinstead of calling the file readlines method explicitly (recall from that iterators avoid loading the entire file into memory all at oncethe file itself is temporary objectwhich will be automatically closed when garbage collected as usual for properly architected scriptsyou can both import this module and call its functionand run it as shell tool from the command line the summer py script calls split to make list of strings representing the line' columnsand eval to convert column strings to numbers here' an input file that uses both blanks and tabs to separate columnsand the result of turning our script loose on itc:\pp \langtype table txt :\pp \langpython summer py table txt [ also notice that because the summer script uses eval to convert file text to numbersyou could really store arbitrary python expressions in the file herefor exampleit' run on file of python code snippetsc:\pp \langtype table txt + << eval(" " * * * pow( , text and language |
5,975 | len('abc'[ , , ][ {'spam': }['spam' :\pp \langpython summer py table txt [ summing with zips and comprehensions we'll revisit eval later in this when we explore expression evaluators sometimes this is more than we want--if we can' be sure that the strings that we run this way won' contain malicious codefor instanceit may be necessary to run them with limited machine access or use more restrictive conversion tools consider the following recoding of the summer function (this is in file summer py in the examples package if you care to experiment with it)def summer(numcolsfilename)sums [ numcols for line in open(filename)cols line split(','nums [int(xfor in colsboth zip(sumsnumssums [ for (xyin bothreturn sums use file iterators assume comma-delimited use limited converter avoid nested for loop xzip is an iterable this version uses int for its conversions from strings to support only numbersand not arbitrary and possibly unsafe expressions although the first four lines of this coding are similar to the originalfor variety this version also assumes the data is separated by commas rather than whitespaceand runs list comprehensions and zip to avoid the nested for loop statement this version is also substantially trickier than the original and so might be less desirable from maintenance perspective if its code is confusingtry adding print call statements after each step to trace the results of each operation here is its handiworkc:\pp \langtype table txt , , , , , , , , , , , , , , , , :\pp \langpython summer py table txt [ summing with dictionaries the summer logic so far worksbut it can be even more general-by making the column numbers key of dictionary rather than an offset in listwe can remove the need to pass in number-columns value altogether besides allowing us to associate meaningful labels with data rather than numeric positionsdictionaries are often more flexible than lists in generalespecially when there isn' fixed size to our problem for instancesuppose 
you need to sum up columns of data stored in text file where the number of columns is not known or fixedstring method utilities |
>>> print(open('table.txt').read())

Here, we cannot preallocate a fixed-length list of sums because the number of columns may vary. Splitting on whitespace extracts the columns, and float converts to numbers, but a fixed-size list won't easily accommodate a set of sums (at least, not without extra code to manage its size). Dictionaries are more convenient here because we can use column positions as keys instead of absolute offsets. The following code demonstrates interactively (it's also in file summer.py in the examples package):

>>> sums = {}
>>> for line in open('table.txt'):
...     cols = [float(col) for col in line.split()]
...     for pos, val in enumerate(cols):
...         sums[pos] = sums.get(pos, 0) + val
...
>>> for key in sorted(sums):
...     print(key, '=', sums[key])
...
>>> sums
{...}

Interestingly, most of this code uses tools added to Python over the years--file and dictionary iterators, comprehensions, dict.get, and the enumerate and sorted built-ins were not yet formed when Python was new. For related examples, also see the tkinter grid examples later in this book for another case of eval table magic at work; that table sums logic is a variation on this theme, which obtains the number of columns from the first line of a data file and tailors its summations for display in a GUI.

Parsing and Unparsing Rule Strings

Splitting comes in handy for dividing text into columns, but it can also be used as a more general parsing tool--by splitting more than once on different delimiters, we can pick apart more complex text. Although such parsing can also be achieved with more powerful tools, such as the regular expressions we'll meet later in this chapter, split-based parsing is simpler to code in quick prototypes, and may run faster.

For instance, the example ahead demonstrates one way that splitting and joining strings can be used to parse sentences in a simple language. It is taken from a rule-based expert system shell (Holmes) that is written in Python and included in this book's examples distribution (more on Holmes in a moment). Rule strings in Holmes take the form:

rule <name> if <test1>, <test2>... then <conclusion1>, <conclusion2>...
where each test and conclusion is a list of words or variables separated by spaces; variables start with ?. To use a rule, it is translated to an internal form--a dictionary with nested lists. To display a rule, it is translated back to the string form. For instance, given the call:

rules.internal_rule('rule x if a ?x, b then c, d ?x')

the conversion in function internal_rule proceeds as follows:

string = 'rule x if a ?x, b then c, d ?x'
i      = ['rule x', 'a ?x, b then c, d ?x']
t      = ['a ?x, b', 'c, d ?x']
r      = ['', 'x']
result = {'rule': 'x', 'if': [['a', '?x'], ['b']], 'then': [['c'], ['d', '?x']]}

We first split around the if, then around the then, and finally around rule. The result is the three substrings that were separated by the keywords. Test and conclusion substrings are split around "," first and spaces last. join is used later to convert back (unparse) to the original string for display. The following is the concrete implementation of this scheme.

Example: PP4E\Lang\rules.py

def internal_rule(string):
    i = string.split(' if ')
    t = i[1].split(' then ')
    r = i[0].split('rule ')
    return {'rule': r[1].strip(), 'if': internal(t[0]), 'then': internal(t[1])}

def external_rule(rule):
    return ('rule '  + rule['rule'] +
            ' if '   + external(rule['if']) +
            ' then ' + external(rule['then']) + '.')

def internal(conjunct):
    res = []                              # 'a b, c d'
    for clause in conjunct.split(','):    # -> ['a b', ' c d']
        res.append(clause.split())        # -> [['a','b'], ['c','d']]
    return res

def external(conjunct):
    strs = []                             # [['a','b'], ['c','d']]
    for clause in conjunct:               # -> ['a b', 'c d']
        strs.append(' '.join(clause))     # -> 'a b, c d'
    return ', '.join(strs)

Today, we could use list comprehensions and generator expressions to gain some conciseness here. The internal and external functions, for instance, could be recoded to simply (also in the examples package):

def internal(conjunct):
    return [clause.split() for clause in conjunct.split(',')]

def external(conjunct):
    return ', '.join(' '.join(clause) for clause in conjunct)

to produce the desired nested lists and string by combining two steps into one. This form might run faster; we'll leave it to the reader to decide whether it is more difficult to understand. As usual, we can test components of this module interactively:

>>> import rules
>>> rules.internal('a ?x, b')
[['a', '?x'], ['b']]
>>> rules.internal_rule('rule x if a ?x, b then c, d ?x')
{'then': [['c'], ['d', '?x']], 'rule': 'x', 'if': [['a', '?x'], ['b']]}
>>> r = rules.internal_rule('rule x if a ?x, b then c, d ?x')
>>> rules.external_rule(r)
'rule x if a ?x, b then c, d ?x.'

Parsing by splitting strings around tokens like this takes you only so far. There is no direct support for recursive nesting of components, and syntax errors are not handled very gracefully. But for simple language tasks like this, string splitting might be enough, at least for prototyping or experimental systems. You can always add a more robust rule parser later or reimplement rules as Python code or classes.

More on the Holmes Expert System Shell

So how are these rules actually used? As mentioned, the rule parser we just met is part of the Python-coded Holmes expert system shell. Holmes is an old system, written around the time of Python's origins, that implemented both forward and backward chaining inference over rule sets. For example, the rule:

rule pylike if ?x likes coding, ?x likes spam then ?x likes python

can be used both to prove whether someone likes Python (chaining backward, from "then" to "if"), and to deduce that someone likes Python from a set of known facts (chaining forward, from "if" to "then"). Deductions may span multiple rules, multiple clauses represent conjunctions, and rules that name the same conclusion represent alternatives. Holmes also performs simple pattern-matching along the way to assign the variables that appear in rules (e.g., ?x
)and it is able to explain both its deductions and questions holmes also served as proof of concept for python in general at time when such evidence was still sometimes necessaryand at last check it still worked unchanged on python platforms howeverits code no longer reflects modern python best practice because of thisi no longer maintain this system today in factit has suffered from bit rot so much and for so long that 've opted not to revisit it for this edition at all its original python code is included in the book examples packagebut it has not been ported to python formand does not accurately reflect python as it exists today that isholmes is an ex-system it has ceased to be and it won' be discussed further here for more modern ai tools and support for pythonsearch the web this is fun text and language |
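To make the sidebar's description of chaining concrete, here is a minimal sketch of one forward-chaining step over rules in the internal dictionary form shown earlier. This is not Holmes itself: the helper names (bind, forward_step), the single-pass binding strategy, and the fact set are invented for illustration, and a real engine would backtrack over alternative fact matches rather than committing to the first consistent one.

```python
def bind(clause, fact):
    """Try to match one rule clause against one known fact, returning a
    {variable: word} binding dictionary, or None on mismatch."""
    if len(clause) != len(fact):
        return None
    env = {}
    for pat, word in zip(clause, fact):
        if pat.startswith('?'):
            if env.get(pat, word) != word:
                return None                  # conflicting binding in clause
            env[pat] = word
        elif pat != word:
            return None                      # literal word mismatch
    return env

def forward_step(rule, facts):
    """If every 'if' clause matches some fact under one consistent binding,
    return the instantiated 'then' clauses; else return an empty list."""
    env = {}
    for clause in rule['if']:
        for fact in facts:
            sub = bind(clause, fact)
            if sub is not None and all(env.get(k, v) == v for k, v in sub.items()):
                env.update(sub)              # bindings stay consistent
                break
        else:
            return []                        # some clause had no matching fact
    return [[env.get(word, word) for word in clause] for clause in rule['then']]

rule = {'rule': 'pylike',
        'if':   [['?x', 'likes', 'coding'], ['?x', 'likes', 'spam']],
        'then': [['?x', 'likes', 'python']]}
facts = [['mel', 'likes', 'coding'], ['mel', 'likes', 'spam']]
print(forward_step(rule, facts))             # [['mel', 'likes', 'python']]
```

Chaining forward, the ?x variable is bound to 'mel' by the first clause, the second clause confirms the binding, and the conclusion is emitted with the variable filled in.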
5,979 | (and souls brave enough to try the portlesson prototype and migrate if you care about performancetry to use the string object' methods rather than things such as regular expressions whenever you can although this is broad rule of thumb and can vary from release to releasesome string methods may be faster because they have less work to do in factwe can learn something from python' own history here today' string object methods began life as python-coded functions in the original string module due to their prevalencethey were later optimized by moving their implementation to the language when you imported stringit internally replaced most of its content with functions imported from the strop extension modulestrop methods were reportedly to , times faster than their python-coded equivalents at the time the result was dramatically faster performance for string client programs without impacting the interface that isstring module clients became instantly faster without having to be modified for the new -based module of coursethese operations evolved further and were finally moved to string object methodstheir only form today but this reflects common pattern in python work similar migration path was applied to the pickle module we met in --the later cpickle recoding in python and _pickle in are compatible but much faster this is great lesson about python developmentmodules can be coded quickly in python at first and translated to later for efficiency if required because the interface to python and extension modules is identical (both are imported modules with callable function attributes) translations of modules are backward compatible with their python prototypes the only impact of the translation of such modules on clients usually is an improvement in performance there is normally no need to move every module to for delivery of an applicationyou can pick and choose performance-critical modules (such as string and picklefor translation and leave others coded in 
python use the timing and profiling techniques discussed in to isolate which modules will give the most improvement when translated to once you dothe next shows how to go about writing -based extension modules regular expression pattern matching splitting and joining strings is simple way to process textas long as it follows the format you expect for more general text analysis tasks where the structure of your data is not so rigidly definedpython provides regular expression matching utilities especially for the kinds of text associated with domains such as the internet and databases todaythis flexibility can be powerful ally regular expression pattern matching |
5,980 | strings supply pattern and string and ask whether the string matches your pattern after matchparts of the string matched by parts of the pattern are made available to your script that ismatches not only give yes/no answerbut also can pick out substrings as well regular expression pattern strings can be complicated (let' be honest--they can be downright gross to look atbut once you get the hang of themthey can replace larger handcoded string search routines-- single pattern string generally does the work of dozens of lines of manual string scanning code and may run much faster they are concise way to encode the expected structure of text and extract portions of it the re module in pythonregular expressions are not part of the syntax of the python language itselfbut they are supported by the re standard library module that you must import to use the module defines functions for running matches immediatelycompiling pattern strings into pattern objectsmatching these objects against stringsand fetching matched substrings after match it also provides tools for pattern-based splittingreplacingand so on the re module implements rich regular expression pattern syntax that tries to be close to that used to code patterns in the perl language (regular expressions are feature of perl worth emulatingfor instancere supports the notions of named groupscharacter classesand nongreedy matches--regular expression pattern operators that match as few characters as possible (other operators always match the longest possible substringthe re module has also been optimized repeatedlyand in python supports matching for both bytes byte-strings and str unicode strings the net effect is that python' pattern support uses perl-like patternsbut is invoked with different toplevel module interface need to point out up front that regular expressions are complex tools that cannot be covered in depth here if this area sparks your interestthe text mastering regular expressionswritten by jeffrey 
friedl ( 'reilly)is good next step to take we won' be able to cover pattern construction itself well enough here to turn you into an expert once you learn how to code patternsthoughthe top-level interface for performing matches is straightforward in factthey are so easy to use that we'll jump right into some live examples before getting into more details first examples there are two basic ways to kick off matchesthrough top-level function calls and via methods of precompiled pattern objects the latter precompiled form is quicker if you will be applying the same pattern more than once--to all lines in text filefor instance text and language |
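Since the precompiled form pays off when the same pattern is applied repeatedly, the idiom can be sketched as follows; the pattern and data lines here are my own illustration, not one of this chapter's examples:

```python
import re

# Compile once: the pattern object is then reused cheaply for every
# line scanned, instead of being recompiled on each call.
pattobj = re.compile(r'ERROR:\s+(.*)')

lines = ['ERROR: disk full\n', 'ok\n', 'ERROR: no route\n']
messages = []
for line in lines:
    matchobj = pattobj.match(line)       # None if the line doesn't match
    if matchobj:
        messages.append(matchobj.group(1).rstrip())

print(messages)                          # ['disk full', 'no route']
```

The same loop would normally read its lines from an open file object rather than a list, but the pattern-object usage is identical.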
(See file re-interactive.txt for all the interactive code run in this section.)

>>> text1 = 'Hello spam World'
>>> text2 = 'Hello spam other'

The match performed in the following code does not precompile: it executes an immediate match to look for all the characters between the words Hello and World in our text strings:

>>> import re
>>> matchobj = re.match('Hello(.*)World', text2)
>>> print(matchobj)
None

When a match fails as it does here (the text2 string doesn't end in World), we get back the None object, which is a Boolean false if tested in an if statement.

In the pattern string we're using here in the first argument to re.match, the words Hello and World match themselves, and (.*) means any character (.) repeated zero or more times (*). The fact that it is enclosed in parentheses tells Python to save away the part of the string matched by that part of the pattern as a group--a matched substring available after the match. To see how, we need to make a match work:

>>> matchobj = re.match('Hello(.*)World', text1)
>>> matchobj.group(1)
' spam '

When a match succeeds, we get back a match object, which has interfaces for extracting matched substrings--the group(1) call returns the portion of the string matched by the first, leftmost, parenthesized portion of the pattern (our (.*)). As mentioned, matching is not just a yes/no answer; by enclosing parts of the pattern in parentheses, it is also a way to extract matched substrings. In this case, we've parsed out the text between Hello and World. Group number 0 is the entire string matched by the pattern--useful if you want to be sure your pattern is consuming all the text you think it is.

The interface for precompiling is similar, but the pattern is implied in the pattern object we get back from the compile call:

>>> pattobj  = re.compile('Hello(.*)World')
>>> matchobj = pattobj.match(text1)
>>> matchobj.group(1)
' spam '

Again, you should precompile for speed if you will run the pattern multiple times, and you normally will when scanning files line by line.

Here's something a bit more complex that hints at the generality of patterns. This one allows for zero or more blanks or tabs at the front ([ \t]*), skips one or more after the word Hello ([ \t]+), and captures characters up to the word at the end, which may start with an uppercase or lowercase
5,982 | letter ([ww])as you can seepatterns can handle wide variations in datapatt '\ ]*hello\ ]+*)[ww]orldline hello spamworldmobj re match(pattlinemobj group( 'spamnotice that we matched str pattern to str string in the last listing we can also match bytes to bytes in order to handle data such as encoded textbut we cannot mix the two string types ( constraint which is true in python in general--python wouldn' have the encoding information needed to know how to convert between the raw bytes and the unicode text)patt '\ ]*hello\ ]+*)[ww]orldline bhello spamworldre match(pattlinegroup( 'spamboth as bytes works too and returns bytes groups but cannot mix str/bytes re match(patthello spamworld'typeerrorcan' use bytes pattern on string-like object re match('\ ]*hello\ ]+*)[ww]orld'linetypeerrorcan' use string pattern on bytes-like object in addition to the tools these examples demonstratethere are methods for scanning ahead to find match (search)scanning to find all matches (findall)splitting and replacing on patternsand so on all have analogous module and precompiled call forms the next section turns to few examples to demonstrate more of the basics string operations versus patterns notice how the preceding example skips optional whitespace and allows for uppercase or lowercase letters this underscores why you may want to use patterns in the first place--they support more general kinds of text than string object methods can here' another case in pointwe've seen that string methods can split on and replace substringbut they don' suffice if the delimiter might be more than one alternative'aaa--bbb--cccsplit('--'['aaa''bbb''ccc''aaa--bbb--cccreplace('--'''aaa bbb cccstring methods use fixed strings 'aaa--bbb==cccsplit(['--''==']typeerrorcan' convert 'listobject to str implicitly 'aaa--bbb==cccreplace(['--''==']'typeerrorcan' convert 'listobject to str implicitly patterns can do similar workbut also can handle alternatives directlyby virtue of their pattern matching 
syntax in the followingthe syntax --|=matches either string -or string ==the syntax [-=matches either the character or ( character set)and the text and language |
5,983 | substring group (split treats groups specially)import re re split('--''aaa--bbb--ccc'['aaa''bbb''ccc're sub('--'''aaa--bbb--ccc''aaa bbb cccsingle string case re split('--|==''aaa--bbb==ccc'['aaa''bbb''ccc're sub('--|=='''aaa--bbb==ccc''aaa bbb cccsplit on -or =replace -or =re split('[-=]''aaa-bbb=ccc'['aaa''bbb''ccc'single char alternative re split('(--)|(==)''aaa--bbb==ccc'['aaa''--'none'bbb'none'==''ccc're split('(?:--)|(?:==)''aaa--bbb==ccc'['aaa''bbb''ccc'split includes groups expr partnot group similarlysplits can extract simple substrings for fixed delimitersbut patterns can also handle surrounding context like bracketsmark parts as optionalignore whitespaceand more in the next tests \smeans zero or more whitespace characters ( character class)\smeans one or more of the same/matches an optional slash[ -zis any lowercase letter ( range);*?means saved substring of zero or more of any character again--but only as many as needed to match the rest of the pattern (nongreedily)and the groups method is used to fetch the substrings matched by the parenthesized subpatterns all at once'spam/ham/eggssplit('/'['spam''ham''eggs're match('*)/*)/*)''spam/ham/eggs'groups(('spam''ham''eggs're match('//''//'groups(('spam''ham''eggs're match('\ */?/?'('spam''ham''eggs'/'groups('hello pattern world!split(['hello''pattern''world!'re match('hello\ *([ - ]*)\ +*?)\ *!''hellopattern ('pattern''world'world !'groups(in factthere' more than one way to match the findall method provides generality that leaves string objects in the dust--it locates all occurrences of pattern and returns all the substrings it matched (or list of tuples for multiple groupsthe search method is similar but stops at the first match--it' like match plus an initial forward scan in the followingstring object finds locate just one specific stringbut patterns can be used to regular expression pattern matching |
5,984 | between'//find('ham'find substring offset re findall('''//'find all matches/groups ['spam''ham''eggs're findall('''['spam''ham''eggs're findall('/?'''[('spam''ham')('eggs''cheese')re search('/?''todays menu'groups(('spam''ham'especially when using findallthe (?soperator comes in handy to force to match end-of-line characters in multiline textwithout it matches everything except lines ends the following searches look for two adjacent bracketed strings with arbitrary text betweenwith and without skipping line breaksre findall(*'\ \ '[re findall('(? *'\ \ '[('spam''eggs')re findall('(? *?'\ \ '[('spam''ham')stop at \ greedy nongreedy to make larger patterns more mnemonicwe can even associate names with matched substring groups in using the pattern syntax and fetch them by name after matchesthough this is of limited utility for findall the next tests look for strings of "wordcharacters (\wseparated by /--this isn' much more than string splitbut parts are namedand search and findall both scan aheadre search('(? \ *)/(? \ *)'aaa/bbb/ccc]'groups(('aaa''bbb're search('(? \ *)/(? \ *)'aaa/bbb/ccc]'groupdict({'part ''aaa''part ''bbb're search('(? \ *)/(? \ *)'aaa/bbb/ccc]'group( 'bbbre search('(? \ *)/(? \ *)'aaa/bbb/ccc]'group('part ''bbbre findall('(? \ *)/(? \ *)'aaa/bbb ccc/ddd]'[('aaa''bbb')('ccc''ddd')finallyalthough basic string operations such as slicing and splits are sometimes enoughpatterns are much more flexible the following uses [to match any character not following the ^and escapes dash within [alternative set using \so it' not taken to be character set range separator it runs equivalent slicessplitsand matchesalong with more general match that the other two cannot approachline 'aaa bbb cccline[: ]line[ : ]line[ : text and language slice data at fixed offsets |
5,985 | line split(['aaa''bbb''ccc're split(+'line['aaa''bbb''ccc're findall('[]+'line['aaa''bbb''ccc'line 'aaa bbb-ccc ddd -/ & *ere findall('[\-/]+'line['aaa''bbb''ccc''ddd'' & * 'split data with fixed delimiters split on general delimiters find non-delimiter runs handle generalized text at this pointif you've never used pattern syntax in the past your head may very well be spinning (or have blown off entirely!before we go into any further exampleslet' dig into few of the details underlying the re module and its patterns using the re module the python re module comes with functions that can search for patterns right away or make compiled pattern objects for running matches later pattern objects (and module search callsin turn generate match objectswhich contain information about successful matches and matched substrings for referencethe next few sections describe the module' interfaces and some of the operators you can use to code patterns module functions the top level of the module provides functions for matchingsubstitutionprecompilingand so oncompile(pattern [flags]compile regular expression pattern string into regular expression pattern objectfor later matching see the reference manual or python pocket reference for the flags argument' meaning match(patternstring [flags]if zero or more characters at the start of string match the pattern stringreturn corresponding match objector none if no match is found roughly like search for pattern that begins with the operator search(patternstring [flags]scan through string for location matching patternand return corresponding match objector none if no match is found findall(patternstring [flags]return list of strings giving all nonoverlapping matches of pattern in string if there are any groups in patternsreturns list of groupsand list of tuples if the pattern has more than one group finditer(patternstring [flags]return iterator over all nonoverlapping matches of pattern in string regular expression pattern matching |
split(pattern, string [, maxsplit, flags])
    Split string by occurrences of pattern. If capturing parentheses (()) are used in the pattern, the text of all groups in the pattern is also returned in the resulting list.

sub(pattern, repl, string [, count, flags])
    Return the string obtained by replacing the (first count) leftmost nonoverlapping occurrences of pattern (a string or a pattern object) in string by repl (which may be a string with backslash escapes that may back-reference a matched group, or a function that is passed a single match object and returns the replacement string).

subn(pattern, repl, string [, count, flags])
    Same as sub, but returns a tuple: (new-string, number-of-substitutions-made).

escape(string)
    Return string with all nonalphanumeric characters backslashed, such that they can be compiled as a string literal.

Compiled pattern objects

At the next level, pattern objects provide similar attributes, but the pattern string is implied. The re.compile function in the previous section is useful to optimize patterns that may be matched more than once (compiled patterns match faster). Pattern objects returned by re.compile have these sorts of attributes:

match(string [, pos [, endpos]])
search(string [, pos [, endpos]])
findall(string [, pos [, endpos]])
finditer(string [, pos [, endpos]])
split(string [, maxsplit])
sub(repl, string [, count])
subn(repl, string [, count])

These are the same as the re module functions, but the pattern is implied, and pos and endpos give start/end string indexes for the match.

Match objects

Finally, when a match or search function or method is successful, we get back a match object (None comes back on failed matches). Match objects export a set of attributes of their own, including:

group([g1, g2, ...])
    Return the substring that matched a parenthesized group (or groups) in the pattern. Accepts group numbers or names. Group numbers start at 1; group number 0 is the entire string matched by the pattern. Returns a tuple when passed multiple group numbers, and the group number defaults to 0 if omitted.

groups()
    Returns a tuple of all groups' substrings of the match (for group numbers 1 and higher).

groupdict()
    Returns a dictionary containing all named groups of the match (see the (?P<name>R) syntax ahead).

start([group]), end([group])
    Indices of the start and end of the substring matched by group (or the entire matched string, if no group is passed).

span([group])
    Returns the two-item tuple: (start(group), end(group)).

expand(template)
    Performs backslash group substitutions; see the Python library manual.

Regular Expression Patterns

Regular expression strings are built up by concatenating single-character regular expression forms, shown in the following table. The longest-matching string is usually matched by each form, except for the nongreedy operators. In the table, R means any regular expression form, C is a character, and N denotes a digit.

re pattern syntax:

Operator          Interpretation
.                 Matches any character (including newline if the DOTALL flag is specified or (?s) appears at the pattern front)
^                 Matches start of the string (of every line in multiline mode)
$                 Matches end of the string (of every line in multiline mode)
C                 Any nonspecial (or backslash-escaped) character matches itself
R*                Zero or more of preceding regular expression (as many as possible)
R+                One or more of preceding regular expression (as many as possible)
R?                Zero or one occurrence of preceding regular expression (optional)
R{m}              Matches exactly m copies of preceding R: a{5} matches 'aaaaa'
R{m,n}            Matches from m to n repetitions of preceding regular expression
*?, +?, ??, {m,n}?   Same as *, +, and ?, but matches as few characters/times as possible; these are known as nongreedy match operators (unlike the others, they match and consume as few characters as possible)
[...]             Defines a character set; e.g., [a-zA-Z] to match all letters (alternatives, with - for ranges)
[^...]            Defines a complemented character set: matches if the character is not in the set
\                 Escapes special characters (e.g., *?+|()) and introduces special sequences (see the escapes table ahead)
\\                Matches a literal \ (write as \\\\ in a pattern, or use r'\\')
\N                Matches the contents of the group of the same number N
R1|R2             Alternative: matches left or right R
R1R2              Concatenation: match both Rs
(R)               Matches any regular expression inside (), and delimits a group (retains the matched substring)
(?:R)             Same as (R), but simply delimits a part and does not denote a saved group
(?=R)             Look-ahead assertion: matches if R matches next, but doesn't consume any of the string
(?!R)             Matches if R doesn't match next; negative of (?=R)
(?P<name>R)       Matches any regular expression inside (), and delimits a named group
(?P=name)         Matches whatever text was matched by the earlier group named name
(?#...)           A comment; ignored
(?letter)         Set mode flag; letter is one of aiLmsux (see the library manual)
(?<=R)            Look-behind assertion: matches if the current position in the string is preceded by a match of R that ends at the current position
(?<!R)            Matches if the current position in the string is not preceded by a match for R; negative of (?<=R)
(?(id/name)yespattern|nopattern)   Will try to match with yespattern if the group with the given id or name exists, else with the optional nopattern

Within patterns, ranges and selections can be combined. For instance, [a-zA-Z0-9_]+ matches the longest possible string of one or more letters, digits, or underscores. Special characters can be escaped as usual in Python strings: [\t ]* matches zero or more tabs and spaces (i.e., it skips such whitespace).

The parenthesized grouping construct, (R), lets you extract matched substrings after successful matches. The portion of the string matched by the expression in parentheses is retained in a numbered register. It's available through the group method of a match object after a successful match.

In addition to the entries in this table, special sequences such as those in the escapes table ahead can be used in patterns, too. Because of Python string rules, you sometimes must double up on backslashes (\\) or use Python raw strings (r'...') to retain backslashes in the pattern verbatim; Python ignores backslashes in normal strings if the letter following the backslash is not recognized as an escape code. Some of the entries in these tables are affected by Unicode when matching str
instead of bytes, and an ASCII flag may be set to emulate the behavior for bytes; see Python's manuals for more details.
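A quick, self-contained session (my own illustration, not one of this chapter's listings) exercises several of the reference entries above: finditer, subn, escape, and the match object's span:

```python
import re

text = 'spam-11 ham-22'

# finditer: an iterator of match objects, one per nonoverlapping match;
# span() gives the (start, end) offsets of each full match.
hits = [(m.group(1), m.span()) for m in re.finditer(r'([A-Za-z]+)-\d+', text)]
print(hits)                                  # [('spam', (0, 7)), ('ham', (8, 14))]

# subn: like sub, but also reports how many substitutions were made
print(re.subn(r'\d+', '*', text))            # ('spam-* ham-*', 2)

# escape: backslash nonalphanumerics so arbitrary text is matched literally
patt = re.escape('a.b*c')
print(re.match(patt, 'a.b*c') is not None)   # True
```

The precompiled-object forms behave the same way, with the pattern argument dropped from each call.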
5,989 | sequence interpretation \number matches text of group number (numbered from \ matches only at the start of the string \ empty string at word boundaries \ empty string not at word boundaries \ any decimal digit character ([ - for ascii\ any nondecimal digit character ([^ - for ascii\ any whitespace character (\ \ \ \ \vfor ascii\ any nonwhitespace character ([\ \ \ \ \vfor ascii\ any alphanumeric character ([ -za- - _for ascii\ any nonalphanumeric character ([^ -za- - _for ascii \ matches only at the end of the string most of the standard escapes supported by python string literals are also accepted by the regular expression parser\ \ \ \ \ \ \ \xand \the python library manual gives these escapesinterpretation and additional details on pattern syntax in general but to further demonstrate how the re pattern syntax is typically used in scriptslet' go back to writing some code more pattern examples for more contextthe next few examples present short test files that match simple but representative pattern forms comments in example - describe the operations exercisedcheck table - to see which operators are used in these patterns if they are still confusingtry running these tests interactivelyand call group( instead of start(to see which strings are being matched by the patterns example - pp \lang\re-basics py ""literalssetsrangesalternativesand escapes all tests here print offset where pattern found ""import re the one to use today patternstring " ""xxabcdxxmatchobj re search(patternstringif matchobjprint(matchobj start()nonspecial chars match themselves means any one char search returns match object or none start is index where matched pattobj re compile(" * *"matchobj pattobj search("xxabcdxx"if matchobj' *means zero or more rs compile returns pattern obj patt search returns match obj regular expression pattern matching |
5,990 | selection sets print(re search(* [de][ - ][^ -ze] \ ?"abcdefg\ "start()alternativesr | means or print(re search("( | )( | )( | ) "aycd "start()test each char print(re search("(?: | )(?: | )(?: | ) "aycd "start()samenot saved print(re search(" |xb|yc|zd"aycd "start()matches just aprint(re search("( |xb|yc|zd)ycd"aycd "start()just first char word boundaries print(re search( "\babcd"abcd "start()print(re search( "abcd\ "abcd "start()\ means word boundary use rto escape '\notice again that there are different ways to kick off match with reby calling module search functions and by making compiled pattern objects in either eventyou can hang on to the resulting match object or not all the print call statements in this script show result of --the offset where the pattern was found in the string in the first testfor examplea matches the abcd at offset in the search string ( after the first xx) :\pp \langpython re-basic py more omitted nextin example - parts of the pattern strings enclosed in parentheses delimit groupsthe parts of the string they matched are available after the match example - pp \lang\re-groups py ""groupsextract substrings matched by res in '()parts groups are denoted by positionbut (?prcan also name them ""import re patt re compile(" ) ) )"saves substrings mobj patt match(" "each '()is group print(mobj group( )mobj group( )mobj group( )group(gives substring patt re compile(" *) *) *)"mobj patt match(" "print(mobj groups()saves substrings groups(gives all groups print(re search("( | )( | )( | ) "aycd "groups()print(re search("(?pa| )(?pb| )(?pc| ) "aycd "groupdict()patt re compile( "[\ ]*#\ *define\ *([ - - ]*)\ **)"mobj patt search(define spam "parts of #define print(mobj groups()\ is whitespace in the first test herefor instancethe three groups each match single characterbut they retain the character matchedcalling group pulls out the character matched text and language |
5,991 | and fourth tests shows how alternatives can be grouped by both position and nameand the last test matches #define lines--more on this pattern in momentc:\pp \langpython re-groups py (' '' '' '(' '' '' '{' '' '' '' '' '' '('spam'' 'finallybesides matches and substring extractionre also includes tools for string replacement or substitution (see example - example - pp \lang\re-subst py "substitutionsreplace occurrences of pattern in stringimport re print(re sub('[abc]''*''xaxaxbxbxcxc')print(re sub('[abc] ''*''xa-xa_xb-xb_xc-xc_')alternatives char print(re sub('spam''spam\\ 'group back ref (or ''' spamy spam')def mapper(matchobj)return 'spammatchobj group( print(re sub('spam'mapper' spamy spam')mapping function in the first testall characters in the set are replacedin the secondthey must be followed by an underscore the last two tests illustrate more advanced group back-references and mapping functions in the replacement note the \\ required to escape \ for python' string rulesr'spam\ would work just as well see also the earlier interactive tests in the section for additional substitution and splitting examplesc:\pp \langpython re-subst py * * * * *xxa- *xb- *xc-xspamxspamy spamxspamy scanning header files for patterns to wrap uplet' turn to more realistic examplethe script in example - puts these pattern operators to more practical use it uses regular expressions to find #define and #include lines in header files and extract their components the generality of the patterns makes them detect variety of line formatspattern groups (the parts in parenthesesare used to extract matched substrings from line after match regular expression pattern matching |
"Scan C header files to extract parts of #define and #include lines"

import sys, re
pattDefine = re.compile(                            # compile to pattobj
    r'^#[\t ]*define[\t ]+(\w+)[\t ]*(.*)')         # "# define xxx yyy..."
pattInclude = re.compile(
    r'^#[\t ]*include[\t ]+[<"]([\w./]+)')          # "# include <xxx>...";
                                                    # \w is like [a-zA-Z0-9_]
def scan(fileobj):
    count = 0
    for line in fileobj:                            # scan by lines: iterator
        count += 1
        matchobj = pattDefine.match(line)           # None if match fails
        if matchobj:
            name = matchobj.group(1)                # substrings for (...) parts
            body = matchobj.group(2)
            print(count, 'defined', name, '=', body.strip())
            continue
        matchobj = pattInclude.match(line)
        if matchobj:
            start, stop = matchobj.span(1)          # start/stop indexes of (...)
            filename = line[start:stop]             # slice out of line
            print(count, 'include', filename)       # same as matchobj.group(1)

if len(sys.argv) == 1:
    scan(sys.stdin)                                 # no args: read stdin
else:
    scan(open(sys.argv[1], 'r'))                    # arg: input filename

To test, let's run this script on the text file in Example 19-7.

Example 19-7. PP4E\Lang\test.h

#ifndef TEST_H
#define TEST_H

#include <stdio.h>
#include <lib/spam.h>
#  include   "Python.h"

#define DEBUG
#define HELLO 'hello regex world'
#  define    SPAM    1234

#define EGGS sunny + side + up
#define  ADDER(arg) 123 + arg
#endif

Notice the spaces after # in some of these lines; regular expressions are flexible enough to account for such departures from the norm. Here is the script at work, picking out #include and #define lines and their parts; for each matched line, it prints the line number, the line type, and any matched substrings:

2 defined TEST_H =
4 include stdio.h
5 include lib/spam.h
6 include Python.h
8 defined DEBUG =
9 defined HELLO = 'hello regex world'
10 defined SPAM = 1234
12 defined EGGS = sunny + side + up
13 defined ADDER = (arg) 123 + arg

For an additional example of regular expressions at work, see the file pygrep.py in the book examples package; it implements a simple pattern-based "grep" file search utility, but was cut here for space. As we'll see, we can also sometimes use regular expressions to parse information from XML and HTML text—the topics of the next section.

XML and HTML Parsing

Beyond string objects and regular expressions, Python ships with support for parsing some specific and commonly used types of formatted text. In particular, it provides precoded parsers for XML and HTML which we can deploy and customize for our text processing goals.

In the XML department, Python includes parsing support in its standard library and plays host to a prolific XML special-interest group. XML (for eXtensible Markup Language) is a tag-based markup language for describing many kinds of structured data. Among other things, it has been adopted in roles such as a standard database and Internet content representation in many contexts. As an object-oriented scripting language, Python mixes remarkably well with XML's core notion of structured document interchange.

XML is based upon a tag syntax familiar to web page writers, used to describe and package data. The xml module package in Python's standard library includes tools for parsing this data from XML text, with both the SAX and the DOM standard parsing models, as well as the Python-specific ElementTree package. Although regular expressions can sometimes extract information from XML documents, too, they can be easily misled by unexpected text, and they don't directly support the notion of arbitrarily nested XML constructs (more on this limitation later when we explore languages in general).

In short, SAX parsers provide a subclass with methods called during the parsing operation, and DOM parsers are given access to an object tree representing the
(usually already parsed) document. SAX parsers are essentially state machines and must record (and possibly stack) state details as the parse progresses; DOM parsers walk object trees using loops, attributes, and methods defined by the DOM standard. ElementTree is roughly a Python-specific analog of DOM, and as such can often yield simpler code; it can also be used to generate XML text from object-based representations.

In addition, Python's standard library implements the client and server sides of the XML-RPC protocol (remote procedure calls that transmit objects encoded as XML over HTTP), as well as a standard HTML parser, html.parser, that works on similar principles and is presented later in this chapter. The third-party domain has even more XML-related tools; most of these are maintained separately from Python to allow for more flexible release schedules. Beginning with Python 2.3, the Expat parser is also included as the underlying engine that drives the parsing process.

XML Parsing in Action

XML processing is a large, evolving topic, and it is mostly beyond the scope of this book. For an example of a simple XML parsing task, though, consider the XML file in Example 19-8. This file defines a handful of O'Reilly Python books—ISBN numbers as attributes, and titles, publication dates, and authors as nested tags (with apologies to Python books not listed in this completely random sample—there are many!).

Example 19-8. PP4E\Lang\Xml\books.xml

<catalog>
    <book isbn="0-596-00128-2">
        <title>Python &amp; XML</title>
        <date>December 2001</date>
        <author>Jones, Drake</author>
    </book>
    <book isbn="0-596-15810-6">
        <title>Programming Python, 4th Edition</title>
        <date>October 2010</date>
        <author>Lutz</author>
    </book>
    <book isbn="0-596-15806-8">
        <title>Learning Python, 4th Edition</title>
        <date>September 2009</date>
        <author>Lutz</author>
    </book>
    <book isbn="0-596-15808-4">
        <title>Python Pocket Reference, 4th Edition</title>
        <date>October 2009</date>
        <author>Lutz</author>
    </book>
    <book isbn="0-596-00797-3">
        <title>Python Cookbook, 2nd Edition</title>
        <date>March 2005</date>
        <author>Martelli, Ravenscroft, Ascher</author>
    </book>
    <book isbn="0-596-10046-9">
        <title>Python in a Nutshell, 2nd Edition</title>
        <date>July 2006</date>
        <author>Martelli</author>
    </book>
</catalog>
The next few sections parse this file's ISBN numbers and titles by example, using each of the four primary Python tools at our disposal—patterns, SAX, DOM, and ElementTree.

Regular expression parsing

In some contexts, the regular expressions we met earlier can be used to parse information from XML files. They are not complete parsers, and are not very robust or accurate in the presence of arbitrary text (text in tag attributes can especially throw them off). Where applicable, though, they offer a simple option. Example 19-9 shows how we might go about parsing the XML file in Example 19-8 with the prior section's re module. Like all four examples in this section, it scans the XML file looking for ISBN numbers and associated titles, and stores the two as keys and values in a Python dictionary.

Example 19-9. PP4E\Lang\Xml\rebook.py

"""
XML parsing: regular expressions (not robust or general)
"""

import re, pprint
text = open('books.xml').read()                      # str if str
pattern = '(?s)isbn="(.*?)".*?<title>(.*?)</title>'  # *? = nongreedy
found = re.findall(pattern, text)                    # (?s) = dot matches \n
mapping = {isbn: title for (isbn, title) in found}   # dict from tuple list
pprint.pprint(mapping)

When run, the re.findall method locates all the nested tags we're interested in, extracts their content, and returns a list of tuples representing the pattern's two parenthesized groups. Python's pprint module displays the dictionary created by the comprehension nicely. The extract works, but only as long as the text doesn't deviate from the expected pattern in ways that would invalidate our script. Moreover, the XML entity for "&" in the first book's title is not un-escaped automatically:

C:\...\PP4E\Lang\Xml> python rebook.py
{'0-596-00128-2': 'Python &amp; XML',
 '0-596-00797-3': 'Python Cookbook, 2nd Edition',
 '0-596-10046-9': 'Python in a Nutshell, 2nd Edition',
 '0-596-15806-8': 'Learning Python, 4th Edition',
 '0-596-15808-4': 'Python Pocket Reference, 4th Edition',
 '0-596-15810-6': 'Programming Python, 4th Edition'}

SAX parsing

To do better, Python's full-blown XML parsing tools let us perform this data extraction in a more accurate and robust way. Example 19-10, for instance, defines a SAX-based parsing procedure: its class implements callback methods that will be called during the parse, and its top-level code creates and runs a parser.
Example 19-10. PP4E\Lang\Xml\saxbook.py

"""
XML parsing: SAX is a callback-based API for intercepting parser events
"""

import xml.sax, xml.sax.handler, pprint

class BookHandler(xml.sax.handler.ContentHandler):
    def __init__(self):                              # handle XML parser events
        self.inTitle = False                         # a state machine model
        self.mapping = {}
    def startElement(self, name, attributes):
        if name == "book":                           # on start book tag
            self.buffer = ""
            self.isbn = attributes["isbn"]           # save ISBN for dict key
        elif name == "title":                        # on start title tag
            self.inTitle = True                      # save title text to follow
    def characters(self, data):
        if self.inTitle:                             # on text within tag
            self.buffer += data                      # save text if in title
    def endElement(self, name):
        if name == "title":                          # on end title tag
            self.inTitle = False
            self.mapping[self.isbn] = self.buffer    # store title text in dict

parser = xml.sax.make_parser()
handler = BookHandler()
parser.setContentHandler(handler)
parser.parse('books.xml')
pprint.pprint(handler.mapping)

The SAX model is efficient, but it is potentially confusing at first glance, because the class must keep track of where the parse currently is using state information. For example, when the title tag is first detected, we set a state flag and initialize a buffer; as each character within the title tag is parsed, we append it to the buffer until the ending portion of the title tag is encountered. The net effect saves the title tag's content as a string. This model is simple, but it can be complex to manage; in cases of potentially arbitrary nesting, for instance, state information may need to be stacked as the class receives callbacks for nested tags.

To kick off the parse, we make a parser object, set its handler to an instance of our class, and start the parse; as Python scans the XML file, our class's methods are called automatically as components are encountered. When the parse is complete, we use the Python pprint module to display the result again—the mapping dictionary object attached to our handler. The result is mostly the same this time, but notice that the "&amp;" escape sequence is properly un-escaped now—SAX performs XML parsing, not text matching:

C:\...\PP4E\Lang\Xml> python saxbook.py
{'0-596-00128-2': 'Python & XML',
 '0-596-00797-3': 'Python Cookbook, 2nd Edition',
 '0-596-10046-9': 'Python in a Nutshell, 2nd Edition',
 '0-596-15806-8': 'Learning Python, 4th Edition',
 '0-596-15808-4': 'Python Pocket Reference, 4th Edition',
 '0-596-15810-6': 'Programming Python, 4th Edition'}

DOM parsing

The DOM parsing model for XML is perhaps simpler to understand—we simply traverse a tree of objects after the parse—but it might be less efficient for large documents, if the document is parsed all at once ahead of time and stored in memory. DOM also supports random access to document parts via tree fetches, nested loops for known structures, and recursive traversals for arbitrary nesting; in SAX, we are limited to a single linear parse. Example 19-11 is a DOM-based equivalent to the SAX parser of the preceding section.

Example 19-11. PP4E\Lang\Xml\dombook.py

"""
XML parsing: DOM gives the whole document to the application as a traversable object
"""

import pprint
import xml.dom.minidom
from xml.dom.minidom import Node

doc = xml.dom.minidom.parse("books.xml")             # load doc into object
                                                     # usually parsed up front
mapping = {}
for node in doc.getElementsByTagName("book"):        # traverse DOM object
    isbn = node.getAttribute("isbn")                 # via DOM object API
    L = node.getElementsByTagName("title")
    for node2 in L:
        title = ""
        for node3 in node2.childNodes:
            if node3.nodeType == Node.TEXT_NODE:
                title += node3.data
        mapping[isbn] = title

# mapping now has the same value as in the SAX example
pprint.pprint(mapping)

The output of this script is the same as what we generated for the SAX parser; here, though, it is built up by walking the document object tree after the parse has finished, using method calls and attributes defined by the cross-language DOM standard specification. This is both a strength and a potential weakness of DOM—its API is language neutral, but it may seem a bit nonintuitive and verbose to some Python programmers accustomed to simpler models:

C:\...\PP4E\Lang\Xml> python dombook.py
{'0-596-00128-2': 'Python & XML',
 '0-596-00797-3': 'Python Cookbook, 2nd Edition',
 '0-596-10046-9': 'Python in a Nutshell, 2nd Edition',
 '0-596-15806-8': 'Learning Python, 4th Edition',
 '0-596-15808-4': 'Python Pocket Reference, 4th Edition',
 '0-596-15810-6': 'Programming Python, 4th Edition'}

ElementTree parsing

As a fourth option, the popular ElementTree package is a standard library tool for both parsing and generating XML. As a parser, it's essentially a more Pythonic type of DOM—it parses documents into a tree of objects again, but the API for navigating the tree is more lightweight, because it's Python-specific.

ElementTree provides easy-to-use tools for parsing, changing, and generating XML documents. For both parsing and generating, it represents documents as a tree of Python "Element" objects. Each element in the tree has a tag name, attribute dictionary, text value, and sequence of child elements. The element object produced by a parse can be navigated with normal Python loops for known structures, and with recursion where arbitrary nesting is possible.

The ElementTree system began its life as a third-party extension, but it was largely incorporated into Python's standard library as the package xml.etree. Example 19-12 shows how to use it to parse our book catalog file one last time.

Example 19-12. PP4E\Lang\Xml\etreebook.py

"""
XML parsing: ElementTree (etree) provides a Python-based API for parsing/generating
"""

import pprint
from xml.etree.ElementTree import parse

mapping = {}
tree = parse('books.xml')
for B in tree.findall('book'):
    isbn = B.attrib['isbn']
    for T in B.findall('title'):
        mapping[isbn] = T.text

pprint.pprint(mapping)

When run, we get the exact same results as for SAX and DOM again, but the code required to extract the file's details seems noticeably simpler this time around:

C:\...\PP4E\Lang\Xml> python etreebook.py
{'0-596-00128-2': 'Python & XML',
 '0-596-00797-3': 'Python Cookbook, 2nd Edition',
 '0-596-10046-9': 'Python in a Nutshell, 2nd Edition',
 '0-596-15806-8': 'Learning Python, 4th Edition',
 '0-596-15808-4': 'Python Pocket Reference, 4th Edition',
 '0-596-15810-6': 'Programming Python, 4th Edition'}
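The text notes that ElementTree can generate XML as well as parse it. The following is a minimal sketch of that round trip; the ISBN and title used here are made-up placeholders, not entries from the book catalog file:

```python
import xml.etree.ElementTree as ET

# Build a document in memory: each Element has a tag name, an attribute
# dictionary, a text value, and child elements, as described above
root = ET.Element('catalog')
book = ET.SubElement(root, 'book', {'isbn': '0-000-00000-0'})  # hypothetical
ET.SubElement(book, 'title').text = 'A Made-Up Title'

xmltext = ET.tostring(root)          # serialize the tree to XML bytes

# Parsing the result back recovers the same structure
reparsed = ET.fromstring(xmltext)
isbn = reparsed.find('book').attrib['isbn']
title = reparsed.find('book/title').text
```

Because the same Element objects serve both roles, a script can parse a file, modify the tree in place, and write it back out with the same API.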
Naturally, there is much more to Python's XML support than these simple examples imply. In deference to space, though, here are pointers to XML resources in lieu of additional examples:

Standard library
    First, be sure to consult the Python library manual for more on the standard library's XML support tools. See the entries for re, xml.sax, xml.dom, and xml.etree for more on this section's examples.

PyXML SIG tools
    You can also find Python XML tools and documentation at the XML Special Interest Group (SIG) web page at www.python.org. This group is dedicated to wedding XML technologies with Python, and it publishes free XML tools independent of Python itself. Much of the standard library's XML support originated with this group's work.

Third-party tools
    You can also find free, third-party Python support tools for XML on the Web by following links at the XML SIG's web page. Of special interest, the 4Suite open source package provides integrated tools for XML processing, including open technologies such as DOM, SAX, RDF, XSLT, XInclude, XPointer, XLink, and XPath.

Documentation
    A variety of books have been published which specifically address XML and text processing in Python. O'Reilly offers a book dedicated to the subject of XML processing in Python, Python & XML, written by Christopher A. Jones and Fred L. Drake, Jr.

As usual, be sure to also see your favorite web search engine for more recent developments on this front.

HTML Parsing in Action

Although more limited in scope, Python's html.parser standard library module also supports HTML-specific parsing, useful in "screen scraping" roles to extract information from web pages. Among other things, this parser can be used to process web replies fetched with the urllib.request module we met in the Internet part of this book, to extract plain text from HTML email messages, and more.

The html.parser module has an API reminiscent of the XML SAX model of the prior section; it provides a parser which we subclass to intercept tags and their data during a parse. Unlike SAX, we don't provide a handler class—instead, we extend the parser class directly. Here's a quick interactive example to demonstrate the basics (I copied all of this section's code into a file, htmlparser.py, in the examples package if you wish to experiment with it yourself):
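The subclassing model just described can be sketched as follows. This is not the book's own listing; the class name and the sample page string are invented for illustration, but the overridden methods are the standard html.parser callbacks:

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collect the text appearing inside a page's <title> tag."""
    def __init__(self):
        super().__init__()
        self.intitle = False            # state flag, as in the SAX example
        self.title = ''
    def handle_starttag(self, tag, attrs):
        if tag == 'title':
            self.intitle = True
    def handle_data(self, data):
        if self.intitle:
            self.title += data          # text may arrive in pieces
    def handle_endtag(self, tag):
        if tag == 'title':
            self.intitle = False

parser = TitleParser()
parser.feed('<html><head><title>Spam Page</title></head>'
            '<body>hi</body></html>')
```

As with SAX, the parser drives the process and calls our methods as tags and text are encountered; here, feeding the page string leaves the title text in parser.title.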