/external/python/cpython3/Lib/test/ |
D | test_csv.py | 2 # csv package unit tests 9 import csv 25 Test the underlying C csv parser in ways that are not appropriate 35 self.assertRaises(csv.Error, ctor, arg, 'foo') 43 quoting=csv.QUOTE_ALL, quotechar='') 45 quoting=csv.QUOTE_ALL, quotechar=None) 48 self._test_arg_valid(csv.reader, []) 49 self.assertRaises(OSError, csv.reader, BadIterable()) 52 self._test_arg_valid(csv.writer, StringIO()) 57 self.assertRaises(OSError, csv.writer, BadWriter()) [all …]
|
/external/python/cpython2/Lib/test/ |
D | test_csv.py | 3 # csv package unit tests 11 import csv 19 Test the underlying C csv parser in ways that are not appropriate 29 self.assertRaises(csv.Error, ctor, arg, 'foo') 37 quoting=csv.QUOTE_ALL, quotechar='') 39 quoting=csv.QUOTE_ALL, quotechar=None) 42 self._test_arg_valid(csv.reader, []) 45 self._test_arg_valid(csv.writer, StringIO()) 55 self.assertEqual(obj.dialect.quoting, csv.QUOTE_MINIMAL) 66 self._test_default_attrs(csv.reader, []) [all …]
|
/external/python/cpython2/Doc/library/ |
D | csv.rst | 2 :mod:`csv` --- CSV File Reading and Writing 5 .. module:: csv 13 single: csv 16 The so-called CSV (Comma Separated Values) format is the most common import and 17 export format for spreadsheets and databases. There is no "CSV standard", so 21 make it annoying to process CSV files from multiple sources. Still, while the 27 The :mod:`csv` module implements classes to read and write tabular data in CSV 30 knowing the precise details of the CSV format used by Excel. Programmers can 31 also describe the CSV formats understood by other applications or define their 32 own special-purpose CSV formats. [all …]
|
/external/python/cpython3/Doc/library/ |
D | csv.rst | 1 :mod:`csv` --- CSV File Reading and Writing 4 .. module:: csv 9 **Source code:** :source:`Lib/csv.py` 12 single: csv 17 The so-called CSV (Comma Separated Values) format is the most common import and 18 export format for spreadsheets and databases. CSV format was used for many 22 differences can make it annoying to process CSV files from multiple sources. 28 The :mod:`csv` module implements classes to read and write tabular data in CSV 31 knowing the precise details of the CSV format used by Excel. Programmers can 32 also describe the CSV formats understood by other applications or define their [all …]
|
/external/rust/crates/csv/ |
D | README.md | 1 csv chapter 3 A fast and flexible CSV reader and writer for Rust, with support for Serde. 5 … status](https://api.travis-ci.org/BurntSushi/rust-csv.svg)](https://travis-ci.org/BurntSushi/rust… 6 …api/projects/status/github/BurntSushi/rust-csv?svg=true)](https://ci.appveyor.com/project/BurntSus… 7 [![](http://meritbadge.herokuapp.com/csv)](https://crates.io/crates/csv) 14 https://docs.rs/csv 17 [tutorial](https://docs.rs/csv/1.0.0/csv/tutorial/index.html) 27 csv = "1.1" 32 This example shows how to read CSV data from stdin and print each record to 36 [cookbook](https://docs.rs/csv/1.0.0/csv/cookbook/index.html). [all …]
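The reading example this README points to (truncated above) is roughly the following — a minimal sketch assuming `csv = "1.1"` under `[dependencies]`:

```rust
use std::{error::Error, io, process};

// Build a CSV reader over stdin and print every record to stdout.
fn run() -> Result<(), Box<dyn Error>> {
    let mut rdr = csv::Reader::from_reader(io::stdin());
    for result in rdr.records() {
        // Each item is a Result, since reading a record can fail.
        let record = result?;
        println!("{:?}", record);
    }
    Ok(())
}

fn main() {
    if let Err(err) = run() {
        eprintln!("error reading CSV from stdin: {}", err);
        process::exit(1);
    }
}
```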
|
D | Cargo.toml.orig | 2 name = "csv" 5 description = "Fast CSV parsing with support for serde." 6 documentation = "http://burntsushi.net/rustdoc/csv/" 7 homepage = "https://github.com/BurntSushi/rust-csv" 8 repository = "https://github.com/BurntSushi/rust-csv" 10 keywords = ["csv", "comma", "parser", "delimited", "serde"] 17 travis-ci = { repository = "BurntSushi/rust-csv" } 18 appveyor = { repository = "BurntSushi/rust-csv" } 21 members = ["csv-core", "csv-index"] 28 csv-core = { path = "csv-core", version = "0.1.6" }
|
D | Cargo.toml | 15 name = "csv" 19 description = "Fast CSV parsing with support for serde." 20 homepage = "https://github.com/BurntSushi/rust-csv" 21 documentation = "http://burntsushi.net/rustdoc/csv/" 23 keywords = ["csv", "comma", "parser", "delimited", "serde"] 26 repository = "https://github.com/BurntSushi/rust-csv" 39 [dependencies.csv-core] 54 repository = "BurntSushi/rust-csv" 57 repository = "BurntSushi/rust-csv"
|
/external/rust/crates/csv/src/ |
D | cookbook.rs | 2 A cookbook of examples for CSV reading and writing. 8 [`rust-csv`](https://github.com/BurntSushi/rust-csv) 11 For **reading** CSV: 18 For **writing** CSV: 24 [submit a pull request](https://github.com/BurntSushi/rust-csv/pulls) 29 This example shows how to read CSV data from stdin and print each record to 39 // Build the CSV reader and iterate over each record. 40 let mut rdr = csv::Reader::from_reader(io::stdin()); 61 $ git clone git://github.com/BurntSushi/rust-csv 62 $ cd rust-csv [all …]
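For the **writing** side the cookbook lists, a minimal sketch (the field values here are illustrative, not taken from the cookbook's data files):

```rust
use std::{error::Error, io};

// Write a header row plus two records as CSV to stdout.
fn main() -> Result<(), Box<dyn Error>> {
    let mut wtr = csv::Writer::from_writer(io::stdout());
    wtr.write_record(&["city", "region", "population"])?;
    wtr.write_record(&["Southborough", "MA", "9686"])?;
    wtr.write_record(&["Northbridge", "MA", "14061"])?;
    // The writer buffers internally, so flush before dropping it.
    wtr.flush()?;
    Ok(())
}
```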
|
D | tutorial.rs | 2 A tutorial for handling CSV data in Rust. 4 This tutorial will cover basic CSV reading and writing, automatic 5 (de)serialization with Serde, CSV transformations and performance. 24 1. [Reading CSV](#reading-csv) 29 1. [Writing CSV](#writing-csv) 38 * [CSV parsing without the standard library](#csv-parsing-without-the-standard-library) 43 In this section, we'll get you set up with a simple program that reads CSV data 56 `csv = "1.1"` to your `[dependencies]` section. At this point, your 66 csv = "1.1" 69 Next, let's build your project. Since you added the `csv` crate as a [all …]
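The automatic (de)serialization the tutorial covers works roughly like this — a sketch assuming `serde` (with its `derive` feature) alongside `csv` in `[dependencies]`, and a hypothetical input whose header row is `city,region,population`:

```rust
use std::{error::Error, io, process};

use serde::Deserialize;

// Field names must match the CSV header row (city,region,population).
#[derive(Debug, Deserialize)]
struct Record {
    city: String,
    region: String,
    population: u64,
}

fn run() -> Result<(), Box<dyn Error>> {
    let mut rdr = csv::Reader::from_reader(io::stdin());
    // `deserialize` maps each row onto the Record struct via Serde.
    for result in rdr.deserialize() {
        let record: Record = result?;
        println!("{}, {}: {}", record.city, record.region, record.population);
    }
    Ok(())
}

fn main() {
    if let Err(err) = run() {
        eprintln!("{}", err);
        process::exit(1);
    }
}
```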
|
D | lib.rs | 2 The `csv` crate provides a fast and flexible CSV reader and writer, with 9 programs that do CSV reading and writing. 21 for reading and writing CSV data respectively. 22 Correspondingly, to support CSV data with custom field or record delimiters 27 depending on whether you're reading or writing CSV data. 29 Unless you're using Serde, the standard CSV record types are 49 csv = "1.1" 62 This example shows how to read CSV data from stdin and print each record to 73 // Build the CSV reader and iterate over each record. 74 let mut rdr = csv::Reader::from_reader(io::stdin()); [all …]
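For the custom field or record delimiters mentioned above, `csv::ReaderBuilder` is the configurable entry point; a small sketch over semicolon-delimited, headerless input (the sample data is made up):

```rust
use std::error::Error;

// Parse semicolon-delimited, headerless data with a configured reader.
fn main() -> Result<(), Box<dyn Error>> {
    let data = "foo;bar;baz\n1;2;3\n";
    let mut rdr = csv::ReaderBuilder::new()
        .delimiter(b';')
        .has_headers(false)
        .from_reader(data.as_bytes());
    for result in rdr.records() {
        println!("{:?}", result?);
    }
    Ok(())
}
```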
|
/external/conscrypt/common/src/test/resources/crypto/ |
D | build_test_files.sh | 3 # Writes out all CSV files for crypto test data based on NIST test vectors. 15 cat "$1"/CBC*.rsp | parse_records.py > aes-cbc.csv 16 cat "$1"/CFB8*.rsp | parse_records.py > aes-cfb8.csv 17 cat "$1"/CFB128*.rsp | parse_records.py > aes-cfb128.csv 18 cat "$1"/ECB*.rsp | parse_records.py > aes-ecb.csv 19 cat "$1"/OFB*.rsp | parse_records.py > aes-ofb.csv 20 cat "$1"/TCBC*.rsp | parse_records.py > desede-cbc.csv 21 cat "$1"/TCFB8*.rsp | parse_records.py > desede-cfb8.csv 22 cat "$1"/TCFB64*.rsp | parse_records.py > desede-cfb64.csv 23 cat "$1"/TECB*.rsp | parse_records.py > desede-ecb.csv [all …]
|
/external/rust/crates/rusqlite/src/vtab/ |
D | csvtab.rs | 1 //! `feature = "csvtab"` CSV Virtual Table. 3 //! Port of [csv](http://www.sqlite.org/cgi/src/finfo?name=ext/misc/csv.c) C 4 //! extension: https://www.sqlite.org/csv.html 14 //! // Assume my_csv.csv 17 //! USING csv(filename = 'my_csv.csv') 38 /// `feature = "csvtab"` Register the "csv" module. 40 /// CREATE VIRTUAL TABLE vtab USING csv( 41 /// filename=FILENAME -- Name of file containing CSV content 42 /// [, schema=SCHEMA] -- Alternative CSV schema. 'CREATE TABLE x(col1 TEXT NOT NULL, col2 INT, ..… 43 /// [, header=YES|NO] -- First row of CSV defines the names of columns if "yes". Default "no". [all …]
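A rough usage sketch for this virtual table, assuming rusqlite is built with the `csvtab` feature, that `my_csv.csv` (the placeholder from the docs above) exists on disk, and that `load_module` is the registration helper the module provides:

```rust
use rusqlite::{params, Connection, Result};

fn main() -> Result<()> {
    let db = Connection::open_in_memory()?;
    // Register the "csv" virtual table module on this connection.
    rusqlite::vtab::csvtab::load_module(&db)?;
    // Expose the file as a table; `header = yes` takes column names
    // from the first row of my_csv.csv.
    db.execute_batch(
        "CREATE VIRTUAL TABLE vtab USING csv(filename = 'my_csv.csv', header = yes)",
    )?;
    let rows: i64 = db.query_row("SELECT count(*) FROM vtab", params![], |row| row.get(0))?;
    println!("{} data rows", rows);
    Ok(())
}
```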
|
/external/rust/crates/csv-core/ |
D | README.md | 1 csv-core 3 A fast CSV reader and writer for use in a `no_std` context. This crate will 6 … status](https://api.travis-ci.org/BurntSushi/rust-csv.png)](https://travis-ci.org/BurntSushi/rust… 7 …api/projects/status/github/BurntSushi/rust-csv?svg=true)](https://ci.appveyor.com/project/BurntSus… 8 [![](http://meritbadge.herokuapp.com/csv-core)](https://crates.io/crates/csv-core) 14 https://docs.rs/csv-core 22 csv-core = "0.1.6" 28 Disabling this feature will drop `csv-core`'s dependency on `libc`. 31 ### Example: reading CSV 33 This example shows how to count the number of fields and records in CSV data. [all …]
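The field/record counting example this README refers to works along these lines (the buffer size and sample data are arbitrary):

```rust
use csv_core::{ReadFieldResult, Reader};

// Count the number of records and fields in `data` without allocating.
fn count(mut data: &[u8]) -> (u64, u64) {
    let mut rdr = Reader::new();
    let (mut nrecords, mut nfields) = (0, 0);
    // Scratch space for a single unescaped field.
    let mut field = [0u8; 1024];
    loop {
        let (result, nin, _) = rdr.read_field(data, &mut field);
        data = &data[nin..];
        match result {
            ReadFieldResult::InputEmpty => {} // all input consumed; the next call signals End
            ReadFieldResult::OutputFull => panic!("field larger than scratch buffer"),
            ReadFieldResult::Field { record_end } => {
                nfields += 1;
                if record_end {
                    nrecords += 1;
                }
            }
            ReadFieldResult::End => break,
        }
    }
    (nrecords, nfields)
}

fn main() {
    let (nrecords, nfields) = count(b"foo,bar,baz\n1,2,3\n");
    println!("{} records, {} fields", nrecords, nfields); // 2 records, 6 fields
}
```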
|
D | Cargo.toml.orig | 2 name = "csv-core" 5 description = "Bare bones CSV parsing with no_std support." 6 documentation = "https://docs.rs/csv-core" 7 homepage = "https://github.com/BurntSushi/rust-csv" 8 repository = "https://github.com/BurntSushi/rust-csv" 10 keywords = ["csv", "comma", "parser", "delimited", "no_std"] 17 travis-ci = { repository = "BurntSushi/rust-csv" } 18 appveyor = { repository = "BurntSushi/rust-csv" }
|
D | Cargo.toml | 15 name = "csv-core" 18 description = "Bare bones CSV parsing with no_std support." 19 homepage = "https://github.com/BurntSushi/rust-csv" 20 documentation = "https://docs.rs/csv-core" 22 keywords = ["csv", "comma", "parser", "delimited", "no_std"] 25 repository = "https://github.com/BurntSushi/rust-csv" 40 repository = "BurntSushi/rust-csv" 43 repository = "BurntSushi/rust-csv"
|
/external/rust/crates/csv-core/src/ |
D | reader.rs | 7 // This may just be one of the more complicated CSV parsers you'll come across. 22 // the configuration of the CSV parser as given by the caller, and indeed, this 53 /// A pull based CSV reader. 55 /// This reader parses CSV data using a finite state machine. Callers can 58 /// Note that this CSV reader is somewhat encoding agnostic. The source data 66 /// A reader has two different ways to read CSV data, each with their own 69 /// * `read_field` - Copies a single CSV field into an output buffer while 72 /// * `read_record` - Copies an entire CSV record into an output buffer while 81 /// is the closest thing to a specification for CSV data. Unfortunately, 82 /// CSV data that is seen in the wild can vary significantly. Often, the CSV [all …]
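A sketch of the `read_record` path described above: each call copies one record's unescaped bytes into a caller-provided output buffer and records per-field end offsets in an `ends` buffer (buffer sizes and sample data are made up):

```rust
use csv_core::{ReadRecordResult, Reader};

fn main() {
    let data = b"name,place\nfoo,Boston\nbar,Austin\n";
    let mut rdr = Reader::new();
    let mut input = &data[..];
    // `output` holds the unescaped bytes of one record; `ends` holds the
    // end offset of each field within `output`.
    let mut output = [0u8; 1024];
    let mut ends = [0usize; 16];
    loop {
        let (result, nin, _nout, nend) = rdr.read_record(input, &mut output, &mut ends);
        input = &input[nin..];
        match result {
            ReadRecordResult::InputEmpty => {} // need more input; the next call sees EOF
            ReadRecordResult::OutputFull | ReadRecordResult::OutputEndsFull => {
                panic!("buffers too small for this record")
            }
            ReadRecordResult::Record => {
                // Slice each field out of `output` using the end offsets.
                let mut start = 0;
                for &end in &ends[..nend] {
                    println!("field: {:?}", std::str::from_utf8(&output[start..end]));
                    start = end;
                }
            }
            ReadRecordResult::End => break,
        }
    }
}
```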
|
D | lib.rs | 2 `csv-core` provides a fast CSV reader and writer for use in a `no_std` context. 7 If you're looking for more ergonomic CSV parsing routines, please use the 8 [`csv`](https://docs.rs/csv) crate. 12 This crate has two primary APIs. The `Reader` API provides a CSV parser, and 13 the `Writer` API provides a CSV writer. 15 # Example: reading CSV 17 This example shows how to count the number of fields and records in CSV data. 52 # Example: writing CSV 54 This example shows how to use the `Writer` API to write valid CSV data. Proper 60 // This is where we'll write out CSV data. [all …]
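The writing example this module documentation refers to looks roughly like the following — each `Writer` call emits bytes into a caller-provided output buffer and reports how many were written:

```rust
use csv_core::Writer;

fn main() {
    // Fixed output buffer and a count of bytes written so far.
    let mut out = [0u8; 1024];
    let mut nout = 0;
    let mut wtr = Writer::new();

    // Plain field, delimiter, then a field that needs quoting.
    let (_, _, n) = wtr.field(&b"foo"[..], &mut out[nout..]);
    nout += n;
    let (_, n) = wtr.delimiter(&mut out[nout..]);
    nout += n;
    let (_, _, n) = wtr.field(&b"bar,baz"[..], &mut out[nout..]);
    nout += n;
    let (_, n) = wtr.terminator(&mut out[nout..]);
    nout += n;
    // Finish the writer so any pending quoting is closed out.
    let (_, n) = wtr.finish(&mut out[nout..]);
    nout += n;

    // Expected output: foo,"bar,baz" followed by a record terminator.
    println!("{}", String::from_utf8_lossy(&out[..nout]));
}
```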
|
/external/autotest/site_utils/ |
D | perf_csv_uploader.py | 8 This module is used to upload csv files generated by performance-related tests 13 contains a path to csv files that need to be uploaded to cns. 16 3. Locate the csv files in GS, and upload them to the desired cns location. 50 """Exception raised when csv files are not found in GS.""" 54 """A class that contains the information of a folder storing csv files to be 55 uploaded, and the logic to upload the csv files. 62 # A class variable whose value is the cns path to upload the csv files to. 69 @param perf_csv_folder: Path of the folder that contains csv files in test 86 """Get the url to the folder storing csv files in GS. 88 The url can be formulated based on the csv folder, test_name and hostname. [all …]
|
/external/autotest/contrib/ |
D | manage_powerunit_info.py | 15 Step 1: create csv: 16 Put attributes in a csv file, e.g. mapping.csv. 17 Each line in mapping.csv consists of 25 ./manage_powerunit_info.py upload --csv mapping_file.csv 31 * Backup existing attributes for all hosts to a csv file: 32 ./manage_powerunit_info.py backup --csv backup.csv 35 import csv 74 """Read power unit information from csv and add to host attributes. 77 @param csv_file: A csv file, each line consists of device_hostname, 82 reader = csv.reader(f, delimiter=',') [all …]
|
/external/python/google-api-python-client/docs/dyn/ |
D | sqladmin_v1beta4.databases.html | 112 …verridden by any database specification in the import file. If fileType is CSV, one database must … 115 # CSV: The file contains CSV data. 118 "csvImportOptions": { # Options for importing data as CSV. 119 "table": "A String", # The table to which CSV data is imported. 120 …ns": [ # The columns to which CSV data is imported. If not specified, all columns of the database … 131 # CSV: The file contains CSV data. 133 "csvExportOptions": { # Options for exporting data as CSV. 137 …tabases are exported, except for the mysql system database. If fileType is CSV, you can specify on… 138 …# PostgreSQL instances: Specify exactly one database to be exported. If fileType is CSV, this data… 226 …verridden by any database specification in the import file. If fileType is CSV, one database must … [all …]
|
D | sqladmin_v1beta4.users.html | 107 …verridden by any database specification in the import file. If fileType is CSV, one database must … 110 # CSV: The file contains CSV data. 113 "csvImportOptions": { # Options for importing data as CSV. 114 "table": "A String", # The table to which CSV data is imported. 115 …ns": [ # The columns to which CSV data is imported. If not specified, all columns of the database … 126 # CSV: The file contains CSV data. 128 "csvExportOptions": { # Options for exporting data as CSV. 132 …tabases are exported, except for the mysql system database. If fileType is CSV, you can specify on… 133 …# PostgreSQL instances: Specify exactly one database to be exported. If fileType is CSV, this data… 196 …verridden by any database specification in the import file. If fileType is CSV, one database must … [all …]
|
D | sqladmin_v1beta4.operations.html | 102 …verridden by any database specification in the import file. If fileType is CSV, one database must … 105 # CSV: The file contains CSV data. 108 "csvImportOptions": { # Options for importing data as CSV. 109 "table": "A String", # The table to which CSV data is imported. 110 …ns": [ # The columns to which CSV data is imported. If not specified, all columns of the database … 121 # CSV: The file contains CSV data. 123 "csvExportOptions": { # Options for exporting data as CSV. 127 …tabases are exported, except for the mysql system database. If fileType is CSV, you can specify on… 128 …# PostgreSQL instances: Specify exactly one database to be exported. If fileType is CSV, this data… 183 …verridden by any database specification in the import file. If fileType is CSV, one database must … [all …]
|
/external/tensorflow/tensorflow/tools/tensorflow_builder/config_detector/data/ |
D | cuda_compute_capability.py | 17 """Retrieves CUDA compute capability from NVIDIA webpage and creates a `.csv`. 28 Creates `compute_capability.csv` file in the same directory by default. If 31 In order to use the new `.csv` as the golden, it should replace the 32 original golden file (`./golden/compute_capability_golden.csv`) with the 52 CUDA_CC_GOLDEN_DIR = PATH_TO_DIR + "/data/golden/compute_capability_golden.csv" 95 `./golden/compute_capability_golden.csv` 127 generate_csv: Boolean for creating csv file to store results. 128 filename: String that is the name of the csv file (without `.csv` ending). [all …]
|
/external/tensorflow/tensorflow/python/data/experimental/ops/ |
D | readers.py | 21 import csv 82 na_value: Additional string to recognize as a NA/NaN CSV value. 113 """Generator that yields rows of CSV file(s) in order.""" 116 rdr = csv.reader( 119 quoting=csv.QUOTE_MINIMAL if use_quote_delim else csv.QUOTE_NONE) 126 "Problem inferring types: CSV row has different number of fields " 134 """Infers column types from the first N valid CSV records of files.""" 162 "quoting": csv.QUOTE_MINIMAL if use_quote_delim else csv.QUOTE_NONE 166 column_names = next(csv.reader(f, **csv_kwargs)) 174 if next(csv.reader(f, **csv_kwargs)) != column_names: [all …]
|
/external/cldr/tools/java/org/unicode/cldr/draft/keyboard/ |
D | KeycodeMap.java | 30 * Creates the mapping from csv contents. The first line must contain the column headers 33 public static KeycodeMap fromCsv(String csv) { in fromCsv() argument 34 checkArgument(!csv.isEmpty()); in fromCsv() 35 List<String> lines = LINE_SPLITTER.splitToList(csv); in fromCsv() 36 checkArgument(lines.get(0).equals("keycode,iso"), "Missing csv headers"); in fromCsv() 39 // No fancy CSV parsing required since there are no strings. in fromCsv() 46 /** Retrieves the csv file relative to the class given. */ 49 String csv = Resources.toString(Resources.getResource(clazz, fileName), in fromResource() local 51 return fromCsv(csv); in fromResource()
|