datastax - Issue with blobs (Cassandra driver, Python)


As part of testing, I'm using the Cassandra Python driver to try to select and delete rows in one of the tables generated by the cassandra-stress tool (standard1, in keyspace1). standard1 consists of several blob columns.

My approach is to extract the (primary) key of the rows and then run a loop that deletes the rows based on that key.

The problem I'm facing is that the Cassandra driver appears to convert the blobs (hex bytes) to strings, so when I try to pass them to the DELETE statement it fails with "cannot parse 'xxxxxx' as hex bytes".

The data in the table, viewed from cqlsh, looks like "0x303038333830343432", whereas the select below extracts keys as, e.g., '069672027'.

Is there a way to prevent the hex bytes from being converted to strings? Or is there another approach I should be using?

Thanks!

query = SimpleStatement("SELECT (key) FROM \"standard1\" LIMIT 10", consistency_level=ConsistencyLevel.LOCAL_QUORUM)
rows = session.execute(query)
for row in rows:
    query = SimpleStatement("DELETE FROM \"standard1\" WHERE key = %s", consistency_level=ConsistencyLevel.LOCAL_QUORUM)
    session.execute(query, (row.key,))

When using simple (unprepared) statements, you need to wrap the value in a buffer so the encoder recognizes it as a blob type.

http://datastax.github.io/python-driver/getting_started.html#type-conversions
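Those two representations are actually the same bytes: cqlsh renders the blob as hex, while the driver hands you back the raw bytes as a (Python 2) str. A quick illustration, using one of the key values from your select as an example:

import binascii

key = '069672027'                      # a row.key value as returned by the driver (raw bytes as a str)
print('0x' + binascii.hexlify(key))    # prints 0x303639363732303237, the same form cqlsh displays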

Try this:

session.execute(query, (buffer(row.key),))

Alternatively, bind a prepared statement, which handles the conversion implicitly.
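For the prepared-statement route, something along these lines should work (a sketch, assuming the same session, table, and rows as above; the driver serializes the bound blob parameter itself, so no buffer() wrapping is needed):

from cassandra import ConsistencyLevel

# Prepare once, outside the loop; bound parameters are encoded by the driver
delete_stmt = session.prepare("DELETE FROM \"standard1\" WHERE key = ?")

for row in rows:
    bound = delete_stmt.bind((row.key,))
    bound.consistency_level = ConsistencyLevel.LOCAL_QUORUM
    session.execute(bound)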

