Let's see how we can capitalize the first letter of a column in a Pandas dataframe, and how to do the same in PySpark. In plain Python, str.capitalize() upper-cases only the first character of a string, while string.capwords() capitalizes the first letter of every word; the same idea extends to capitalizing every word in a list or in a file. In PySpark, the last n characters of a string column can be extracted with the substring function by passing a negative starting position, and selectExpr() can slice a date column into year, month, and day parts. To extract the first n characters with substr, we need to specify the column, the 1-based starting position, and the length.
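The plain-Python methods mentioned above can be sketched in a few lines (the sample string is made up for the demo):

```python
import string

s = "hello world from pyspark"

# str.capitalize(): first character upper-cased, the rest lower-cased
print(s.capitalize())        # Hello world from pyspark

# string.capwords(): first letter of every word upper-cased
print(string.capwords(s))    # Hello World From Pyspark
```

capwords() splits on whitespace and rejoins with single spaces, which is usually what you want for titles but will collapse runs of spaces.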
Related operations on string columns in PySpark include padding (lpad() and rpad()), trimming leading and trailing spaces, typecasting between string and date or integer, adding leading zeros, and converting to upper, lower, and title case; all of these functions take a column-type argument. There is no built-in capitalize() for columns, but you can use a workaround: split each value into its first letter and the rest, upper-case the first letter, lower-case the rest, and concatenate them back together. Alternatively, use a UDF if you want to stick with Python's .capitalize(). A substring of a column is obtained with substr() (or substring()) inside a select; a negative starting position counts from the right, so a start of -2 with length 2 extracts the last two characters. A sample use of selectExpr() is slicing a date column (in year-month-day form) into year, month, and day parts.
Python's capitalize() converts the first character of a string to upper case and the rest to lower case. If the first character is a number, it stays a number (a digit cannot be upper-cased), but the rest of the string is still lower-cased. A pandas dataframe is similar to a table with rows and columns, and the same behaviour is available on a string column through Series.str.capitalize(). You probably know you should capitalize proper nouns and the first word of every sentence; usually you don't capitalize after a colon, but there are exceptions. In PySpark, upper() helps in creating upper-case text, and substring() of a column can be taken inside a select. In this section we will also see how to extract the first n characters from the left and the last n characters from the right of a column in PySpark.
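A minimal pandas sketch of this column-level capitalize (the column name and values are hypothetical):

```python
import pandas as pd

df = pd.DataFrame({"name": ["alice", "BOB", "charlie brown"]})

# Series.str.capitalize(): upper-cases the first character of each value,
# lower-cases the rest
df["name"] = df["name"].str.capitalize()
print(df["name"].tolist())   # ['Alice', 'Bob', 'Charlie brown']
```

Note that "BOB" becomes "Bob": the rest of the value is forced to lower case, which may or may not be what you want.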
Sometimes we need to capitalize the first letters of one column in a dataframe; this can be achieved by the methods below. In this tutorial, substrings of a column are taken in two ways: with substring() from pyspark.sql.functions, and with substr() on pyspark.sql.Column, in both cases providing the position and the length of the slice we want. To capitalize the first letter of every word in a file, we read the data from the file, iterate through it line by line with a loop, capitalize each word, and write the updated data back to the file.
To convert all dataframe column names to lower case in PySpark, loop over df.columns and rename each one (the loop variable is named col_name here so it does not shadow pyspark.sql.functions.col):

    for col_name in df_employee.columns:
        df_employee = df_employee.withColumnRenamed(col_name, col_name.lower())
    df_employee.printSchema()
    # root
    #  |-- emp_id: string (nullable = true)

The full set of built-in string functions is documented at https://spark.apache.org/docs/2.0.1/api/python/_modules/pyspark/sql/functions.html. upper() takes a column as its argument and converts the column to upper case; lower() and initcap() work the same way. While exploring data or engineering new features, you might encounter a need to capitalize the first letter of the string in a column. Is there a way to easily capitalize these fields? One option is string slicing combined with upper() and concat(); let us perform a few tasks to understand the behaviour of the case-conversion functions.
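The same lower-casing of column names can be sketched in pandas as well (the column names here are hypothetical):

```python
import pandas as pd

df = pd.DataFrame({"Emp_ID": [1], "First_Name": ["alice"]})

# Rename every column to its lower-case form in one assignment
df.columns = [c.lower() for c in df.columns]
print(list(df.columns))   # ['emp_id', 'first_name']
```

Unlike the PySpark loop, pandas lets you replace the whole columns index at once.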
The same output can also be produced with the substr() method defined on pyspark.sql.Column, for example df.select(df.date.substr(1, 4)) to take the first four characters.
Using employees data, create a data frame. In pandas, a Name column can be converted to upper case in two ways. Method 1 uses the string accessor directly: data['Name'] = data['Name'].str.upper(). Method 2 applies a lambda that calls upper(): data['Name'] = data['Name'].apply(lambda x: x.upper()). In the PySpark examples that follow, df is an input dataframe that is already defined and loaded.
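A self-contained sketch of the two pandas methods (the names are made up rather than read from a CSV):

```python
import pandas as pd

data = pd.DataFrame({"Name": ["avery bradley", "john holland"]})

# Method 1: the vectorised string accessor
data["Name_upper"] = data["Name"].str.upper()

# Method 2: apply() with a lambda calling str.upper()
data["Name_upper2"] = data["Name"].apply(lambda x: x.upper())

print(data["Name_upper"].tolist())   # ['AVERY BRADLEY', 'JOHN HOLLAND']
```

The accessor form is generally preferred: it is vectorised and handles NaN values, whereas the lambda raises on non-string entries.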
capitalize() in Python capitalizes the first character of a string (or, via the string accessor, the first character of each value in a dataframe column):

    text = "this is beautiful earth!"
    print(text.capitalize())   # This is beautiful earth!

For completeness, CSS offers the ::first-letter pseudo-element for styling the first letter of rendered text (for backward compatibility, browsers also accept the older one-colon form :first-letter). Note that the starting position in PySpark's substring() is not zero-based but 1-based, so substring(col, 1, 3) returns the first three characters; substring() is typically combined with withColumn() to add the sliced value as a new column.
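Capitalizing every word in a file, as mentioned earlier, can be sketched with string.capwords() (the file is created in a temporary directory just for the demo):

```python
import os
import string
import tempfile

# Create a demo file; in practice `path` would be an existing file
path = os.path.join(tempfile.mkdtemp(), "demo.txt")
with open(path, "w") as f:
    f.write("hello world\nspark tutorial\n")

# Read, capitalize every word on each line, then write the data back
with open(path) as f:
    lines = [string.capwords(line) for line in f]

with open(path, "w") as f:
    f.writelines(line + "\n" for line in lines)

with open(path) as f:
    print(f.read())   # prints the capitalized lines
```

capwords() also strips the trailing newline from each line, so it is re-added when writing.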
To capitalize the first letter of every word in a string column, PySpark provides the built-in initcap() function. When concatenating columns with concat(), you can improvise by adding a comma followed by a space between first_name and last_name. Python's title() method behaves similarly on plain strings, converting the first character of each word to upper case and the remaining characters to lower case. Once a UDF is created, it can be re-used on multiple DataFrames and in SQL (after registering it). With the case-conversion functions (upper(), lower(), initcap()) and the slice-and-concatenate workaround shown earlier, you can capitalize just the first letter of a PySpark column without leaving the DataFrame API.