
Garbage value from JSON? – Getting Help


This is the desired result: DB Fiddle – SQL Database Playground

PGAdmin shows about the same result:
[screenshot]

But from Go I get garbage instead of the JSON value:

[{"status_id":0,"val":"IkFrdGl2Ig=="},{"status_id":1,"val":"IkluYWt0aXYi"}]

The query:

WITH list AS
(SELECT status_id, json_array_elements(lang) AS row FROM status)
SELECT status_id, row -> 'val' AS val FROM list
WHERE row ->> 'key' = $1

The Go code:

// query to return a list with status languages
func getlang(query string, val string) interface{} {
	if len(query) > 0 {
		var list []map[string]interface{}
		rows, err := db.Queryx(query, val)
		if err != nil {
			log("no data")
		}
		defer rows.Close()

		for rows.Next() {
			row := make(map[string]interface{})
			err = rows.MapScan(row)
			if err != nil {
				log(err.Error())
			}
			list = append(list, row)
		}

		if len(list) == 0 {
			return "norec"
		}
		return list
	}
	return nil
}

The query works fine outside of Go, but in Go the value is replaced with garbage?
Regardless of whether I use a placeholder or not.
What am I doing wrong?

This isn't "garbage":

$ base64 --decode <<< IkFrdGl2Ig==
"Aktiv"                                                          
$ base64 --decode <<< IkluYWt0aXYi
"Inaktiv"


Why do I have to decode only the "val"? And not the rest of the query result?

And why do all other queries work without decoding, but this one doesn't?

You haven't shown any other queries, nor have you shown any details about the tables and columns in question.

Though from the "PGAdmin" screenshot it looks as if the val column is a JSON column. Not sure what type the other things are that "work".

The other queries don't use JSON.

[screenshot]

But the question remains: where and how to decode, and why only ONE of the JSON columns? All the other columns are correct UTF-8.

Is nothing else using JSON, or is all the other JSON "fine"?

This is my first JSON column. The others use "normal" columns. And it's just the JSON column that's not UTF-8.

Which DB driver do you use? Does it support JSON columns natively? Why do you use a JSON column at all for just plain strings?

How do you convert the result of your DB query into the JSON you shared? How does it look if you inspect the query result directly using fmt.Printf("%#v", list)?

I found one solution:

Instead of using the "JSON object"
row -> 'val' (which returns "Active", with quotes)

I used the "value" as text
row ->> 'val' (which returns Active)

The question remains: why can't Go handle the object with UTF-8 like dbfiddle and PGAdmin?

That's why I asked which driver you use and whether it supports JSON columns.

From other languages I know that there is often only partial support for JSON columns, e.g. Elixir's Ecto only supports maps and arrays in JSON columns, and the values have to be homogeneously typed.

Got your point quite clearly here. You have made it a lot easier for me.

How do I know if these drivers support JSON?

The latter mentions in its README that it's in maintenance mode, though its description reads more like a self-deprecation in favour of GitHub – jackc/pgx: PostgreSQL driver and toolkit for Go, which in turn explicitly mentions JSON and JSONB support in its README.


@Sibert The driver question and answer from @niamul21 is the right answer for this, but it might miss a nice suggestion if you want to have the struct scanning from sqlx but with the native-level pgx support.

In this case, lib/pq with sqlx means you aren't using native postgres types, and everything also goes over to the db as text. The pgx library supports all the postgres types and can have better performance, gets security updates (since it isn't archived like the repo you're using), and finally, if you want the features of sqlx (a fine lib, but it uses just the database interface and can't use native interfaces) then you can probably move to using scany: https://github.com/georgysavva/scany


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.
