I'm trying to transfer data from one BigQuery table to another via Beam, but I get the following error:
WARNING:root:Retry with exponential backoff: waiting for 4.12307941111 seconds before retrying get_query_location because we caught exception: AttributeError: 'module' object has no attribute 'ensure_str'
Traceback for above exception (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/utils/retry.py", line 197, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 261, in get_query_location
    response = self.client.jobs.Insert(request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 342, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 703, in _RunMethod
    download)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 674, in PrepareHttpRequest
    method_config.query_params, request, global_params)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 551, in __ConstructQueryParams
    global_params, self.__client.global_params)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 357, in global_params
    return encoding.CopyProtoMessage(self._default_global_params)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py", line 112, in CopyProtoMessage
    return JsonToMessage(type(message), MessageToJson(message))
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py", line 123, in JsonToMessage
    return _ProtoJsonApiTools.Get().decode_message(message_type, message)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py", line 309, in decode_message
    message_type, result)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py", line 209, in decode_message
    encoded_message = six.ensure_str(encoded_message)
Here is my code:
import argparse

import apache_beam as beam
from apache_beam import pvalue
from apache_beam.options.pipeline_options import (
    PipelineOptions, SetupOptions, StandardOptions)


class SplitBDoFn(beam.DoFn):
    Word_tag = 'Word_tag'

    def process(self, element):
        if element:
            yield pvalue.TaggedOutput(self.Word_tag, element)


def run(argv=None):
    parser = argparse.ArgumentParser()
    known_args, pipeline_args = parser.parse_known_args(argv)
    pipeline_args.extend([
        '--runner=DirectRunner',
        '--project=myproject',
        '--gcs_location=US',
        '--staging_location=gs://test-bucket/stage',
        '--temp_location=gs://test-bucket/temp',
        '--job_name=test-job',
    ])
    pipeline_options = PipelineOptions(pipeline_args)
    pipeline_options.view_as(SetupOptions).save_main_session = True
    pipeline_options.view_as(StandardOptions).streaming = True
    with beam.Pipeline(options=pipeline_options) as p:
        bq_source = beam.io.BigQuerySource(
            query='select * from myproject:raw_data.events where utc_date = "2019-07-20"')
        bq_data = p | beam.io.Read(bq_source)
        multiple_lines = (
            bq_data
            | 'SplitBDoFn' >> beam.ParDo(SplitBDoFn()).with_outputs(
                SplitBDoFn.Word_tag))
        Word_tag = multiple_lines.Word_tag
        (Word_tag
         | "output_Word_tag" >> beam.io.WriteToBigQuery(
             table='test',
             dataset='temp',
             project='myproject',
             schema=data_schema,
             # validate=True,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
Beam version: 2.13.0
Has anyone run into this issue before, or is there an error in my code?
It looks like ensure_str was added to six in version 1.12.0, and it is required via apitools.
I suspect the root cause is that you have an older version of six (1.11 or earlier) installed in your virtualenv. Could you try creating a fresh virtualenv before running your pipeline again, or try running the quickstart example?
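For context, ensure_str just coerces text/bytes to the native str type; here is a rough stdlib-only sketch of what apitools is calling (assuming Python 3 semantics, not the actual six source), so you can see why an older six without this attribute fails at that exact line:

```python
def ensure_str(s, encoding="utf-8", errors="strict"):
    # Pass str through unchanged; decode bytes using the given encoding.
    # On a six older than 1.12.0 this function does not exist, so
    # six.ensure_str(...) raises the AttributeError from the traceback.
    if isinstance(s, str):
        return s
    if isinstance(s, bytes):
        return s.decode(encoding, errors)
    raise TypeError("not expecting type '%s'" % type(s))

print(ensure_str("abc"))
print(ensure_str(b"abc"))
```

Upgrading six to 1.12.0 or newer (or recreating the virtualenv so pip resolves apitools' dependencies fresh) should make the real six.ensure_str available.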