Apr 23 08:12:48.826006 ip-10-0-133-47 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 08:12:49.261266 ip-10-0-133-47 kubenswrapper[2559]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:12:49.261266 ip-10-0-133-47 kubenswrapper[2559]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 08:12:49.261266 ip-10-0-133-47 kubenswrapper[2559]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:12:49.261266 ip-10-0-133-47 kubenswrapper[2559]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 08:12:49.261266 ip-10-0-133-47 kubenswrapper[2559]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
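The deprecation notices above say these flags should move into the file passed via the kubelet's --config flag. A minimal sketch of the equivalent KubeletConfiguration fields follows; the endpoint, plugin directory, reservation, and eviction values are illustrative assumptions, not values read from this node:

```yaml
# Hypothetical KubeletConfiguration fragment (example values only).
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
# replaces --minimum-container-ttl-duration; the suggested
# substitute is eviction thresholds such as:
evictionHard:
  memory.available: 100Mi
```

Note that flags still present on the command line generally take precedence over the config file, so a migration is only complete once the flags are removed from the unit file.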
Apr 23 08:12:49.262492 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.262414    2559 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 08:12:49.270311 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270290    2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:12:49.270311 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270306    2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:12:49.270311 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270310    2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:12:49.270311 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270314    2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:12:49.270311 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270317    2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270320    2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270323    2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270326    2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270335    2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270340    2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270343    2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270345    2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270348    2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270351    2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270354    2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270356    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270359    2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270361    2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270364    2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270368    2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270373    2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270376    2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270378    2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:12:49.270498 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270381    2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270383    2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270386    2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270389    2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270391    2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270394    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270396    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270399    2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270401    2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270404    2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270407    2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270412    2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270416    2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270419    2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270421    2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270424    2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270427    2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270430    2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270432    2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:12:49.270973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270435    2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270438    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270440    2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270443    2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270445    2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270448    2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270451    2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270453    2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270456    2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270459    2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270461    2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270463    2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270466    2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270468    2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270471    2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270473    2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270476    2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270480    2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270482    2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270485    2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:12:49.271437 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270488    2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270490    2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270493    2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270495    2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270497    2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270500    2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270502    2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270505    2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270508    2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270511    2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270514    2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270516    2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270519    2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270521    2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270524    2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270526    2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270530    2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270532    2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270535    2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270538    2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:12:49.271964 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270541    2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270543    2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270546    2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270548    2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270940    2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270946    2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270949    2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270952    2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270954    2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270957    2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270960    2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270962    2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270966    2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270968    2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270971    2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270976    2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270979    2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270983    2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270986    2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:12:49.272451 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270989    2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270992    2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270995    2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.270998    2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271001    2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271003    2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271006    2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271008    2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271011    2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271014    2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271016    2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271020    2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271022    2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271025    2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271028    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271031    2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271033    2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271036    2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271039    2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271041    2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:12:49.272958 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271044    2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271046    2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271048    2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271051    2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271053    2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271056    2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271058    2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271060    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271063    2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271065    2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271068    2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271071    2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271073    2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271076    2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271078    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271081    2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271083    2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271086    2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271088    2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271091    2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:12:49.273457 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271094    2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271096    2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271099    2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271101    2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271109    2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271112    2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271114    2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271116    2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271119    2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271122    2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271124    2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271127    2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271129    2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271132    2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271134    2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271136    2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271139    2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271142    2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271145    2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:12:49.274000 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271148    2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271150    2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271153    2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271155    2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271158    2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271160    2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271163    2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271166    2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271168    2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271171    2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271173    2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271176    2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271243    2559 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271250    2559 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271255    2559 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271260    2559 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271265    2559 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271270    2559 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271274    2559 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271278    2559 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271281    2559 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 08:12:49.274466 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271284    2559 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271288    2559 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271291    2559 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271294    2559 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271297    2559 flags.go:64] FLAG: --cgroup-root=""
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271300    2559 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271303    2559 flags.go:64] FLAG: --client-ca-file=""
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271306    2559 flags.go:64] FLAG: --cloud-config=""
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271309    2559 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271312    2559 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271316    2559 flags.go:64] FLAG: --cluster-domain=""
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271319    2559 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271322    2559 flags.go:64] FLAG: --config-dir=""
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271325    2559 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271328    2559 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271332    2559 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271336    2559 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271339    2559 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271342    2559 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271345    2559 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271348    2559 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271351    2559 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271354    2559 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271357    2559 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271361    2559 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 08:12:49.274995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271364    2559 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271367    2559 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271370    2559 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271378    2559 flags.go:64] FLAG: --enable-server="true"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271381    2559 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271385    2559 flags.go:64] FLAG: --event-burst="100"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271388    2559 flags.go:64] FLAG: --event-qps="50"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271391    2559 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271395    2559 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271398    2559 flags.go:64] FLAG: --eviction-hard=""
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271401    2559 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271404    2559 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271407    2559 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271411    2559 flags.go:64] FLAG: --eviction-soft=""
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271413    2559 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271416    2559 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271421    2559 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271424    2559 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271427    2559 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271430    2559 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271432    2559 flags.go:64] FLAG: --feature-gates=""
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271436    2559 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271440    2559 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271443    2559 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271446    2559 flags.go:64] FLAG:
--healthz-bind-address="127.0.0.1" Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271449 2559 flags.go:64] FLAG: --healthz-port="10248" Apr 23 08:12:49.275596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271452 2559 flags.go:64] FLAG: --help="false" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271455 2559 flags.go:64] FLAG: --hostname-override="ip-10-0-133-47.ec2.internal" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271458 2559 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271461 2559 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271464 2559 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271467 2559 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271471 2559 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271473 2559 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271476 2559 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271480 2559 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271483 2559 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271486 2559 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 08:12:49.276254 ip-10-0-133-47 
kubenswrapper[2559]: I0423 08:12:49.271489 2559 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271492 2559 flags.go:64] FLAG: --kube-reserved="" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271495 2559 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271497 2559 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271500 2559 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271503 2559 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271506 2559 flags.go:64] FLAG: --lock-file="" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271509 2559 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271512 2559 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271515 2559 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271521 2559 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 08:12:49.276254 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271524 2559 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271527 2559 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271530 2559 flags.go:64] FLAG: --logging-format="text" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271532 2559 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 
08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271536 2559 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271539 2559 flags.go:64] FLAG: --manifest-url="" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271541 2559 flags.go:64] FLAG: --manifest-url-header="" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271545 2559 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271549 2559 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271552 2559 flags.go:64] FLAG: --max-pods="110" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271556 2559 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271559 2559 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271562 2559 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271565 2559 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271568 2559 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271571 2559 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271574 2559 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271581 2559 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 
08:12:49.271584 2559 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271587 2559 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271591 2559 flags.go:64] FLAG: --pod-cidr="" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271594 2559 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271599 2559 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271602 2559 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 08:12:49.276845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271605 2559 flags.go:64] FLAG: --pods-per-core="0" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271607 2559 flags.go:64] FLAG: --port="10250" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271611 2559 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271614 2559 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03cd1d7ceda019106" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271617 2559 flags.go:64] FLAG: --qos-reserved="" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271620 2559 flags.go:64] FLAG: --read-only-port="10255" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271623 2559 flags.go:64] FLAG: --register-node="true" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271627 2559 flags.go:64] FLAG: --register-schedulable="true" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271630 2559 flags.go:64] FLAG: 
--register-with-taints="" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271634 2559 flags.go:64] FLAG: --registry-burst="10" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271636 2559 flags.go:64] FLAG: --registry-qps="5" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271639 2559 flags.go:64] FLAG: --reserved-cpus="" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271642 2559 flags.go:64] FLAG: --reserved-memory="" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271646 2559 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271649 2559 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271652 2559 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271654 2559 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271657 2559 flags.go:64] FLAG: --runonce="false" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271660 2559 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271663 2559 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271667 2559 flags.go:64] FLAG: --seccomp-default="false" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271669 2559 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271672 2559 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271675 2559 
flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271679 2559 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271682 2559 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 08:12:49.277439 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271684 2559 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271687 2559 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271690 2559 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271693 2559 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271696 2559 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271713 2559 flags.go:64] FLAG: --system-cgroups="" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271716 2559 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271721 2559 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271724 2559 flags.go:64] FLAG: --tls-cert-file="" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271727 2559 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271733 2559 flags.go:64] FLAG: --tls-min-version="" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271736 2559 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 08:12:49.278105 ip-10-0-133-47 
kubenswrapper[2559]: I0423 08:12:49.271739 2559 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271746 2559 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271749 2559 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271753 2559 flags.go:64] FLAG: --v="2" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271757 2559 flags.go:64] FLAG: --version="false" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271761 2559 flags.go:64] FLAG: --vmodule="" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271765 2559 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.271768 2559 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271857 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271861 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271864 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271867 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:12:49.278105 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271870 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271872 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:12:49.278691 
ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271875 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271877 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271880 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271883 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271885 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271888 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271890 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271893 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271896 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271898 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271901 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271904 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271906 2559 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271909 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271911 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271914 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271916 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:12:49.278691 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271919 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271921 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271925 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271928 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271930 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271933 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271935 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271938 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:12:49.279233 ip-10-0-133-47 
kubenswrapper[2559]: W0423 08:12:49.271940 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271943 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271945 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271949 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271952 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271954 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271957 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271959 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271962 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271965 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271967 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271970 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:12:49.279233 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271973 2559 feature_gate.go:328] unrecognized 
feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271976 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271979 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271981 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271984 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271986 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271989 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271992 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271994 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271997 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.271999 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272002 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272004 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:12:49.279744 ip-10-0-133-47 
kubenswrapper[2559]: W0423 08:12:49.272007 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272010 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272013 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272015 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272019 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272022 2559 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272024 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:12:49.279744 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272027 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272029 2559 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272032 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272036 2559 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272039 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272042 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 
08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272044 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272047 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272049 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272052 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272055 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272057 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272060 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272062 2559 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272065 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272067 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272070 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272072 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272075 2559 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstallIBMCloud
Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272078 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:12:49.280240 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272080 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272083 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.272087 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.272962 2559 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.279294 2559 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.279310 2559 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279358 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279363 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279366 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279369 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279372 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279375 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279378 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279381 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279383 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:12:49.280765 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279386 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279389 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279391 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279394 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279396 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279399 2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279402 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279404 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279407 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279409 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279412 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279415 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279418 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279420 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279423 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279425 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279428 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279431 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279434 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279437 2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:12:49.281149 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279439 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279443 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279448 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279451 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279454 2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279456 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279459 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279462 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279464 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279467 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279469 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279472 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279475 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279477 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279480 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279482 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279485 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279489 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279492 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:12:49.281631 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279495 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279498 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279501 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279503 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279506 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279509 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279511 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279515 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279517 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279520 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279523 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279526 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279528 2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279531 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279534 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279536 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279539 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279541 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279544 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:12:49.282199 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279546 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279549 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279551 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279554 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279557 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279560 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279562 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279565 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279567 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279570 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279573 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279575 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279578 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279580 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279583 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279586 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279588 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279591 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:12:49.282668 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279593 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.279599 2559 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279690 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279694 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279697 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279715 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279719 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279722 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279725 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279727 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279730 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279734 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279737 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279740 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279743 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:12:49.283177 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279746 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279749 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279751 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279754 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279757 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279760 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279762 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279765 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279767 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279770 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279772 2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279775 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279778 2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279780 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279783 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279785 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279788 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279790 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279793 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279796 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:12:49.283543 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279798 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279801 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279803 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279805 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279809 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279811 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279814 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279816 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279818 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279821 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279823 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279826 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279828 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279831 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279833 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279836 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279838 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279841 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279843 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279846 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:12:49.284061 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279849 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279851 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279854 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279856 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279859 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279861 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279864 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279866 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279868 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279871 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279874 2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279877 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279880 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279883 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279885 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279888 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279890 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279893 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279895 2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279898 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:12:49.284535 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279901 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279903 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279906 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279908 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279911 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279913 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279916 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279918 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279921 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279924 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279926 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279928 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:49.279931 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.279936 2559 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.280587 2559 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 08:12:49.285066 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.283492 2559 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 08:12:49.285441 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.284326 2559 server.go:1019] "Starting client certificate rotation"
Apr 23 08:12:49.285441 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.284420 2559 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:12:49.285441 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.284458 2559 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:12:49.313092 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.313076 2559 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:12:49.314853 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.314824 2559 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:12:49.329415 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.329392 2559 log.go:25] "Validated CRI v1 runtime API"
Apr 23 08:12:49.335074 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.335059 2559 log.go:25] "Validated CRI v1 image API"
Apr 23 08:12:49.336294 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.336278 2559 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 08:12:49.342830 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.342812 2559 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:12:49.343665 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.343643 2559 fs.go:135] Filesystem UUIDs: map[68e5d70f-d8f4-4947-9041-eccfe0fc6bfe:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a0a9dc01-ed39-4b7a-976c-1f60c8795b98:/dev/nvme0n1p4]
Apr 23 08:12:49.343756 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.343667 2559 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 08:12:49.349146 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.349028 2559 manager.go:217] Machine: {Timestamp:2026-04-23 08:12:49.347317882 +0000 UTC m=+0.401913160 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101566 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e5a1a335b23c8fe2d4dc4d4fb489e SystemUUID:ec2e5a1a-335b-23c8-fe2d-4dc4d4fb489e BootID:6496f71d-478c-41e0-ad51-b4e2f80143e3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:34:ec:b9:40:c5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:34:ec:b9:40:c5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:4e:6d:35:ed:35:db Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 08:12:49.349146 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.349133 2559 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 08:12:49.349314 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.349233 2559 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 08:12:49.350394 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.350368 2559 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 08:12:49.350551 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.350397 2559 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-47.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 08:12:49.350636 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.350561 2559 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 08:12:49.350636 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.350573 2559 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 08:12:49.350636 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.350592 2559 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:12:49.352202 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.352190 2559 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:12:49.353255 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.353243 2559 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:12:49.353526 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.353514 2559 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 08:12:49.356173 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.356159 2559 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 08:12:49.356243 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.356185 2559 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 08:12:49.356243 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.356201 2559 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 08:12:49.356243 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.356213 2559 kubelet.go:397] "Adding apiserver pod source"
Apr 23 08:12:49.356243 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.356226 2559 apiserver.go:42] "Waiting for node sync before watching
apiserver pods" Apr 23 08:12:49.357355 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.357341 2559 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 08:12:49.357420 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.357364 2559 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 08:12:49.361070 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.361045 2559 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 08:12:49.362367 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.362355 2559 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 08:12:49.362827 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.362814 2559 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6zz4s" Apr 23 08:12:49.363837 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.363826 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 08:12:49.363876 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.363843 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 08:12:49.363876 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.363850 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 08:12:49.363876 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.363856 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 08:12:49.363876 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.363862 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 08:12:49.363876 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.363869 2559 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 08:12:49.363876 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.363875 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 08:12:49.364030 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.363880 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 08:12:49.364030 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.363888 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 08:12:49.364030 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.363894 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 08:12:49.364030 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.363903 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 08:12:49.364030 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.363911 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 08:12:49.364745 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.364734 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 08:12:49.364745 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.364744 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 08:12:49.367933 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.367918 2559 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 08:12:49.368008 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.367951 2559 server.go:1295] "Started kubelet" Apr 23 08:12:49.368056 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.368009 2559 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 08:12:49.368298 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.368187 2559 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 08:12:49.368454 ip-10-0-133-47 kubenswrapper[2559]: 
I0423 08:12:49.368440 2559 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 08:12:49.368936 ip-10-0-133-47 systemd[1]: Started Kubernetes Kubelet. Apr 23 08:12:49.371422 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.371381 2559 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 08:12:49.371547 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.371529 2559 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6zz4s" Apr 23 08:12:49.371603 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.371558 2559 server.go:317] "Adding debug handlers to kubelet server" Apr 23 08:12:49.371768 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.371744 2559 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-47.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 08:12:49.372477 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.372453 2559 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 08:12:49.372477 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.372453 2559 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-47.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 08:12:49.377600 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.376681 2559 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-47.ec2.internal.18a8ee362835f341 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-47.ec2.internal,UID:ip-10-0-133-47.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-47.ec2.internal,},FirstTimestamp:2026-04-23 08:12:49.367929665 +0000 UTC m=+0.422524923,LastTimestamp:2026-04-23 08:12:49.367929665 +0000 UTC m=+0.422524923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-47.ec2.internal,}" Apr 23 08:12:49.380195 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.380176 2559 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 08:12:49.380334 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.380249 2559 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 08:12:49.380897 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.380880 2559 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 08:12:49.380897 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.380899 2559 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 08:12:49.381056 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.381012 2559 reconstruct.go:97] "Volume reconstruction finished" Apr 23 08:12:49.381056 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.381045 2559 reconciler.go:26] "Reconciler: start to sync state" Apr 23 08:12:49.381281 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.381253 2559 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix 
/run/containerd/containerd.sock: connect: no such file or directory Apr 23 08:12:49.381281 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.381282 2559 factory.go:55] Registering systemd factory Apr 23 08:12:49.381397 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.381292 2559 factory.go:223] Registration of the systemd container factory successfully Apr 23 08:12:49.381397 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.381298 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-47.ec2.internal\" not found" Apr 23 08:12:49.381497 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.380877 2559 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 08:12:49.381573 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.381559 2559 factory.go:153] Registering CRI-O factory Apr 23 08:12:49.381573 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.381574 2559 factory.go:223] Registration of the crio container factory successfully Apr 23 08:12:49.381667 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.381593 2559 factory.go:103] Registering Raw factory Apr 23 08:12:49.381667 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.381604 2559 manager.go:1196] Started watching for new ooms in manager Apr 23 08:12:49.382804 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.382780 2559 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 08:12:49.382889 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.382810 2559 manager.go:319] Starting recovery of all containers Apr 23 08:12:49.393721 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.393553 2559 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:12:49.394110 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.394095 2559 manager.go:324] Recovery completed Apr 23 08:12:49.398318 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.398305 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:12:49.398562 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.398547 2559 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-47.ec2.internal\" not found" node="ip-10-0-133-47.ec2.internal" Apr 23 08:12:49.400640 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.400625 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:12:49.400743 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.400655 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:12:49.400743 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.400682 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:12:49.401137 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.401124 2559 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 08:12:49.401137 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.401135 2559 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 08:12:49.401224 ip-10-0-133-47 
kubenswrapper[2559]: I0423 08:12:49.401149 2559 state_mem.go:36] "Initialized new in-memory state store" Apr 23 08:12:49.403247 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.403236 2559 policy_none.go:49] "None policy: Start" Apr 23 08:12:49.403287 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.403252 2559 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 08:12:49.403287 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.403261 2559 state_mem.go:35] "Initializing new in-memory state store" Apr 23 08:12:49.456654 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.453255 2559 manager.go:341] "Starting Device Plugin manager" Apr 23 08:12:49.456654 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.453289 2559 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 08:12:49.456654 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.453298 2559 server.go:85] "Starting device plugin registration server" Apr 23 08:12:49.456654 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.453477 2559 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 08:12:49.456654 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.453488 2559 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 08:12:49.456654 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.453612 2559 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 08:12:49.456654 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.453684 2559 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 08:12:49.456654 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.453729 2559 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 08:12:49.456654 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.454359 2559 eviction_manager.go:267] "eviction manager: failed to check if we 
have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 08:12:49.456654 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.454389 2559 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-47.ec2.internal\" not found" Apr 23 08:12:49.520492 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.520442 2559 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 08:12:49.521576 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.521555 2559 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 08:12:49.521649 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.521582 2559 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 08:12:49.521649 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.521600 2559 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 23 08:12:49.521649 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.521607 2559 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 08:12:49.521649 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.521637 2559 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 08:12:49.524857 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.524838 2559 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:12:49.553835 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.553808 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:49.555020 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.555005 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:49.555084 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.555034 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:49.555084 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.555049 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:49.555084 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.555075 2559 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.563028 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.563015 2559 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.563069 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.563034 2559 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-47.ec2.internal\": node \"ip-10-0-133-47.ec2.internal\" not found"
Apr 23 08:12:49.581099 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.581081 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-47.ec2.internal\" not found"
Apr 23 08:12:49.621903 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.621877 2559 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-47.ec2.internal"]
Apr 23 08:12:49.621947 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.621942 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:49.622677 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.622663 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:49.622757 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.622687 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:49.622757 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.622696 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:49.623839 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.623827 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:49.623958 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.623944 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.623992 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.623975 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:49.624465 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.624450 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:49.624515 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.624479 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:49.624515 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.624498 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:49.624515 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.624507 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:49.624597 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.624480 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:49.624597 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.624562 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:49.625629 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.625616 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.625689 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.625639 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:12:49.626304 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.626287 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:12:49.626372 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.626311 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:12:49.626372 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.626328 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:12:49.648301 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.648280 2559 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-47.ec2.internal\" not found" node="ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.652370 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.652356 2559 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-47.ec2.internal\" not found" node="ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.681525 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.681505 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-47.ec2.internal\" not found"
Apr 23 08:12:49.683727 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.683712 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/02f1197a0b1540bfd79dfadea3147481-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal\" (UID: \"02f1197a0b1540bfd79dfadea3147481\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.683782 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.683763 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/02f1197a0b1540bfd79dfadea3147481-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal\" (UID: \"02f1197a0b1540bfd79dfadea3147481\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.683820 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.683783 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ffda46011da9673651bbfae81aa86260-config\") pod \"kube-apiserver-proxy-ip-10-0-133-47.ec2.internal\" (UID: \"ffda46011da9673651bbfae81aa86260\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.781956 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.781902 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-47.ec2.internal\" not found"
Apr 23 08:12:49.784081 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.784066 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/02f1197a0b1540bfd79dfadea3147481-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal\" (UID: \"02f1197a0b1540bfd79dfadea3147481\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.784134 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.784096 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/02f1197a0b1540bfd79dfadea3147481-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal\" (UID: \"02f1197a0b1540bfd79dfadea3147481\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.784134 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.784115 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ffda46011da9673651bbfae81aa86260-config\") pod \"kube-apiserver-proxy-ip-10-0-133-47.ec2.internal\" (UID: \"ffda46011da9673651bbfae81aa86260\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.784202 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.784169 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/02f1197a0b1540bfd79dfadea3147481-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal\" (UID: \"02f1197a0b1540bfd79dfadea3147481\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.784202 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.784178 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ffda46011da9673651bbfae81aa86260-config\") pod \"kube-apiserver-proxy-ip-10-0-133-47.ec2.internal\" (UID: \"ffda46011da9673651bbfae81aa86260\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.784202 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.784179 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/02f1197a0b1540bfd79dfadea3147481-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal\" (UID: \"02f1197a0b1540bfd79dfadea3147481\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.882577 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.882549 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-47.ec2.internal\" not found"
Apr 23 08:12:49.950912 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.950884 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.955424 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:49.955406 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:49.983162 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:49.983143 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-47.ec2.internal\" not found"
Apr 23 08:12:50.083612 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:50.083593 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-47.ec2.internal\" not found"
Apr 23 08:12:50.184121 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:50.184102 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-47.ec2.internal\" not found"
Apr 23 08:12:50.191759 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.191734 2559 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:12:50.280948 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.280924 2559 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:50.283769 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.283754 2559 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 08:12:50.283877 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.283860 2559 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:12:50.283940 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.283912 2559 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:12:50.283940 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.283919 2559 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:12:50.284010 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:50.283931 2559 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://ae39a21dbe8e6432aa54c7de1516bd71-c3d800621f92e6aa.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.133.47:37228->32.195.223.205:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:50.284010 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.283963 2559 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-47.ec2.internal"
Apr 23 08:12:50.301733 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.301713 2559 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 08:12:50.356887 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.356839 2559 apiserver.go:52] "Watching apiserver"
Apr 23 08:12:50.365336 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.365315 2559 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 08:12:50.367166 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.367141 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-133-47.ec2.internal","openshift-cluster-node-tuning-operator/tuned-gxzvp","openshift-image-registry/node-ca-lwlgs","openshift-multus/multus-tn7qf","openshift-multus/network-metrics-daemon-nkshk","openshift-network-operator/iptables-alerter-99xdt","openshift-ovn-kubernetes/ovnkube-node-cqpq2","kube-system/konnectivity-agent-6b6vz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw","openshift-dns/node-resolver-58xlf","openshift-multus/multus-additional-cni-plugins-5lbkz","openshift-network-diagnostics/network-check-target-zxxfs"]
Apr 23 08:12:50.372153 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.372137 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.373604 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.373583 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lwlgs"
Apr 23 08:12:50.374321 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.374302 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:12:50.374410 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.374346 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 08:12:50.374410 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.374357 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gmfdz\""
Apr 23 08:12:50.375008 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.374982 2559 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 08:07:49 +0000 UTC" deadline="2027-11-17 13:13:39.606723964 +0000 UTC"
Apr 23 08:12:50.375008 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.375007 2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13757h0m49.231718669s"
Apr 23 08:12:50.375086 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.375068 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.375630 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.375609 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 08:12:50.375630 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.375628 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vkhph\""
Apr 23 08:12:50.375866 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.375679 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 08:12:50.376025 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.376005 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 08:12:50.376172 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.376159 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:12:50.376241 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.376222 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-99xdt"
Apr 23 08:12:50.376293 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:50.376231 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e" Apr 23 08:12:50.377156 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.377140 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 08:12:50.377156 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.377149 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 08:12:50.377293 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.377144 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 08:12:50.377293 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.377211 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nl2hr\"" Apr 23 08:12:50.377500 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.377487 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 08:12:50.377773 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.377758 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.378357 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.378338 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jmfxl\"" Apr 23 08:12:50.378428 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.378342 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 08:12:50.378473 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.378429 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 08:12:50.378599 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.378587 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:12:50.379119 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.379106 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-6b6vz" Apr 23 08:12:50.379850 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.379833 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 08:12:50.379923 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.379834 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 08:12:50.380043 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.380029 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 08:12:50.380080 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.380034 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 08:12:50.380080 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.380063 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 08:12:50.380137 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.380104 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qz2cb\"" Apr 23 08:12:50.380330 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.380315 2559 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 08:12:50.380379 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.380366 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 08:12:50.380715 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.380688 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" Apr 23 08:12:50.381145 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.381127 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 08:12:50.381145 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.381140 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8gssx\"" Apr 23 08:12:50.381272 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.381212 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 08:12:50.382297 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.382279 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-58xlf" Apr 23 08:12:50.382793 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.382775 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 08:12:50.382884 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.382854 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 08:12:50.382884 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.382860 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-x5zd7\"" Apr 23 08:12:50.382989 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.382902 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 08:12:50.384238 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.384213 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5lbkz" Apr 23 08:12:50.384482 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.384443 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 08:12:50.386079 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.385047 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 08:12:50.386079 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.386013 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-7qspq\"" Apr 23 08:12:50.386397 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.386379 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:12:50.386500 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.386486 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 08:12:50.386548 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:50.386503 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e" Apr 23 08:12:50.386767 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.386750 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-4jxws\"" Apr 23 08:12:50.386859 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.386838 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-492hr\" (UniqueName: \"kubernetes.io/projected/49916299-de53-445c-b3f3-1fc134a6e4cc-kube-api-access-492hr\") pod \"node-ca-lwlgs\" (UID: \"49916299-de53-445c-b3f3-1fc134a6e4cc\") " pod="openshift-image-registry/node-ca-lwlgs" Apr 23 08:12:50.386915 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.386882 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p4bk\" (UniqueName: \"kubernetes.io/projected/4f98cb24-fbc1-4718-bb62-2d985cadf144-kube-api-access-2p4bk\") pod \"iptables-alerter-99xdt\" (UID: \"4f98cb24-fbc1-4718-bb62-2d985cadf144\") " pod="openshift-network-operator/iptables-alerter-99xdt" Apr 23 08:12:50.386915 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.386909 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-cni-bin\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.387012 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.386938 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be5235d2-68c0-4721-b23c-a3e24721ae65-env-overrides\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.387012 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.386987 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-os-release\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.387138 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387028 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 08:12:50.387138 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387051 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-run-k8s-cni-cncf-io\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.387138 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387104 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5xv7\" (UniqueName: \"kubernetes.io/projected/58cfa3d8-1954-40b4-ac7c-f082a1e07777-kube-api-access-s5xv7\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.387292 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387171 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-var-lib-openvswitch\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.387292 ip-10-0-133-47 kubenswrapper[2559]: I0423 
08:12:50.387214 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-kubernetes\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.387292 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387265 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-host\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.387436 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387322 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-device-dir\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" Apr 23 08:12:50.387436 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387361 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f98cb24-fbc1-4718-bb62-2d985cadf144-host-slash\") pod \"iptables-alerter-99xdt\" (UID: \"4f98cb24-fbc1-4718-bb62-2d985cadf144\") " pod="openshift-network-operator/iptables-alerter-99xdt" Apr 23 08:12:50.387436 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387389 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-log-socket\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.387585 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387470 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-sysctl-d\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.387585 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387514 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-etc-selinux\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" Apr 23 08:12:50.387585 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387555 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cfbc15a5-2478-44d8-902f-9d7269635055-etc-tuned\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.387735 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387613 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/49916299-de53-445c-b3f3-1fc134a6e4cc-serviceca\") pod \"node-ca-lwlgs\" (UID: \"49916299-de53-445c-b3f3-1fc134a6e4cc\") " pod="openshift-image-registry/node-ca-lwlgs" Apr 23 08:12:50.387735 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387649 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-kubelet\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.387735 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387724 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-run-netns\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.387882 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387751 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-run-ovn\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.387882 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387780 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be5235d2-68c0-4721-b23c-a3e24721ae65-ovnkube-config\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.387882 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387825 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be5235d2-68c0-4721-b23c-a3e24721ae65-ovn-node-metrics-cert\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.388024 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387886 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be5235d2-68c0-4721-b23c-a3e24721ae65-ovnkube-script-lib\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.388024 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387921 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpc7s\" (UniqueName: \"kubernetes.io/projected/cfbc15a5-2478-44d8-902f-9d7269635055-kube-api-access-xpc7s\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.388024 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.387953 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-multus-socket-dir-parent\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.388024 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388002 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-run-multus-certs\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.388222 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388038 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-modprobe-d\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " 
pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.388222 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388106 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-lib-modules\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.388222 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388194 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-multus-cni-dir\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.388366 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388242 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-etc-kubernetes\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.388366 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388275 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9hzn\" (UniqueName: \"kubernetes.io/projected/ad7c4f52-9a6e-427d-8a37-1c27216d412e-kube-api-access-q9hzn\") pod \"network-metrics-daemon-nkshk\" (UID: \"ad7c4f52-9a6e-427d-8a37-1c27216d412e\") " pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:12:50.388646 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388621 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-sys\") pod 
\"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.388800 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388659 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-socket-dir\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" Apr 23 08:12:50.388800 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388681 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-registration-dir\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" Apr 23 08:12:50.388800 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388718 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-cnibin\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.388800 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388755 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs\") pod \"network-metrics-daemon-nkshk\" (UID: \"ad7c4f52-9a6e-427d-8a37-1c27216d412e\") " pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:12:50.388800 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388774 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-sysconfig\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.388800 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388793 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" Apr 23 08:12:50.389094 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388814 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-var-lib-cni-bin\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.389094 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388842 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f98cb24-fbc1-4718-bb62-2d985cadf144-iptables-alerter-script\") pod \"iptables-alerter-99xdt\" (UID: \"4f98cb24-fbc1-4718-bb62-2d985cadf144\") " pod="openshift-network-operator/iptables-alerter-99xdt" Apr 23 08:12:50.389094 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388863 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-systemd-units\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.389094 
ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388884 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-run-openvswitch\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.389094 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388904 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-var-lib-kubelet\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.389094 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388926 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7a708d86-77eb-45d6-9cb2-d56380306e76-tmp-dir\") pod \"node-resolver-58xlf\" (UID: \"7a708d86-77eb-45d6-9cb2-d56380306e76\") " pod="openshift-dns/node-resolver-58xlf"
Apr 23 08:12:50.389094 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.388967 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-run-netns\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.389094 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389000 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-var-lib-cni-multus\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.389094 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389021 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-hostroot\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.389094 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389042 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-etc-openvswitch\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.389094 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389059 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e3a4bfb9-cf61-4040-ba6a-b48997b19fce-agent-certs\") pod \"konnectivity-agent-6b6vz\" (UID: \"e3a4bfb9-cf61-4040-ba6a-b48997b19fce\") " pod="kube-system/konnectivity-agent-6b6vz"
Apr 23 08:12:50.389094 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389080 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-multus-conf-dir\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389100 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-sysctl-conf\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389121 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7a708d86-77eb-45d6-9cb2-d56380306e76-hosts-file\") pod \"node-resolver-58xlf\" (UID: \"7a708d86-77eb-45d6-9cb2-d56380306e76\") " pod="openshift-dns/node-resolver-58xlf"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389135 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcghw\" (UniqueName: \"kubernetes.io/projected/7a708d86-77eb-45d6-9cb2-d56380306e76-kube-api-access-bcghw\") pod \"node-resolver-58xlf\" (UID: \"7a708d86-77eb-45d6-9cb2-d56380306e76\") " pod="openshift-dns/node-resolver-58xlf"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389151 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-system-cni-dir\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389175 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-node-log\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389207 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-cni-netd\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389248 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58cfa3d8-1954-40b4-ac7c-f082a1e07777-cni-binary-copy\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389264 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-var-lib-kubelet\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389278 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/58cfa3d8-1954-40b4-ac7c-f082a1e07777-multus-daemon-config\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389327 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-run\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389352 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-sys-fs\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389368 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm7zl\" (UniqueName: \"kubernetes.io/projected/159f214d-7437-426c-9ffa-1470336cf132-kube-api-access-lm7zl\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389383 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-slash\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389423 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-run-ovn-kubernetes\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389470 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2spmh\" (UniqueName: \"kubernetes.io/projected/be5235d2-68c0-4721-b23c-a3e24721ae65-kube-api-access-2spmh\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389488 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e3a4bfb9-cf61-4040-ba6a-b48997b19fce-konnectivity-ca\") pod \"konnectivity-agent-6b6vz\" (UID: \"e3a4bfb9-cf61-4040-ba6a-b48997b19fce\") " pod="kube-system/konnectivity-agent-6b6vz"
Apr 23 08:12:50.389516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389502 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49916299-de53-445c-b3f3-1fc134a6e4cc-host\") pod \"node-ca-lwlgs\" (UID: \"49916299-de53-445c-b3f3-1fc134a6e4cc\") " pod="openshift-image-registry/node-ca-lwlgs"
Apr 23 08:12:50.390013 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389516 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-run-systemd\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.390013 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389536 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.390013 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389552 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-systemd\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.390013 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.389565 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cfbc15a5-2478-44d8-902f-9d7269635055-tmp\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.393325 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.393305 2559 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:12:50.415159 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.415138 2559 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-f9g5f"
Apr 23 08:12:50.424467 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.424447 2559 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-f9g5f"
Apr 23 08:12:50.479748 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:50.479719 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02f1197a0b1540bfd79dfadea3147481.slice/crio-c8759443678e304376e185e814bdf855c56a115d9f5714ede926a730a164898f WatchSource:0}: Error finding container c8759443678e304376e185e814bdf855c56a115d9f5714ede926a730a164898f: Status 404 returned error can't find the container with id c8759443678e304376e185e814bdf855c56a115d9f5714ede926a730a164898f
Apr 23 08:12:50.479961 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:50.479943 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffda46011da9673651bbfae81aa86260.slice/crio-a3b3cdac954ab2904a3d4f77a3e13cf4f8f34468587d71b8f021227a1604270f WatchSource:0}: Error finding container a3b3cdac954ab2904a3d4f77a3e13cf4f8f34468587d71b8f021227a1604270f: Status 404 returned error can't find the container with id a3b3cdac954ab2904a3d4f77a3e13cf4f8f34468587d71b8f021227a1604270f
Apr 23 08:12:50.482519 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.482501 2559 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 08:12:50.486856 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.486834 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 08:12:50.490568 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490543 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-cnibin\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.490648 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490589 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-system-cni-dir\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.490648 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490616 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-node-log\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.490648 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490640 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-cni-netd\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.490973 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490660 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58cfa3d8-1954-40b4-ac7c-f082a1e07777-cni-binary-copy\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.490973 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490683 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-var-lib-kubelet\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.490973 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490681 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-system-cni-dir\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.490973 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490727 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/58cfa3d8-1954-40b4-ac7c-f082a1e07777-multus-daemon-config\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.490973 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490746 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-cni-netd\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.490973 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490767 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c88dv\" (UniqueName: \"kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv\") pod \"network-check-target-zxxfs\" (UID: \"d399127c-4640-46ee-b923-43ca2adc7c1e\") " pod="openshift-network-diagnostics/network-check-target-zxxfs"
Apr 23 08:12:50.490973 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490816 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-run\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.490973 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490853 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-sys-fs\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw"
Apr 23 08:12:50.490973 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490877 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm7zl\" (UniqueName: \"kubernetes.io/projected/159f214d-7437-426c-9ffa-1470336cf132-kube-api-access-lm7zl\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw"
Apr 23 08:12:50.490973 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490903 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-system-cni-dir\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.490973 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490816 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-var-lib-kubelet\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.490973 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490922 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-run\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.490973 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490946 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-slash\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.490980 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-run-ovn-kubernetes\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491006 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2spmh\" (UniqueName: \"kubernetes.io/projected/be5235d2-68c0-4721-b23c-a3e24721ae65-kube-api-access-2spmh\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491031 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e3a4bfb9-cf61-4040-ba6a-b48997b19fce-konnectivity-ca\") pod \"konnectivity-agent-6b6vz\" (UID: \"e3a4bfb9-cf61-4040-ba6a-b48997b19fce\") " pod="kube-system/konnectivity-agent-6b6vz"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491057 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49916299-de53-445c-b3f3-1fc134a6e4cc-host\") pod \"node-ca-lwlgs\" (UID: \"49916299-de53-445c-b3f3-1fc134a6e4cc\") " pod="openshift-image-registry/node-ca-lwlgs"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491063 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-sys-fs\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491080 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-run-systemd\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491106 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491131 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-systemd\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491152 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cfbc15a5-2478-44d8-902f-9d7269635055-tmp\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491176 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-os-release\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491202 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-492hr\" (UniqueName: \"kubernetes.io/projected/49916299-de53-445c-b3f3-1fc134a6e4cc-kube-api-access-492hr\") pod \"node-ca-lwlgs\" (UID: \"49916299-de53-445c-b3f3-1fc134a6e4cc\") " pod="openshift-image-registry/node-ca-lwlgs"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491216 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58cfa3d8-1954-40b4-ac7c-f082a1e07777-cni-binary-copy\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491239 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-slash\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491270 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2p4bk\" (UniqueName: \"kubernetes.io/projected/4f98cb24-fbc1-4718-bb62-2d985cadf144-kube-api-access-2p4bk\") pod \"iptables-alerter-99xdt\" (UID: \"4f98cb24-fbc1-4718-bb62-2d985cadf144\") " pod="openshift-network-operator/iptables-alerter-99xdt"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491284 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491286 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/58cfa3d8-1954-40b4-ac7c-f082a1e07777-multus-daemon-config\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.491578 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491299 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-cni-bin\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491342 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-run-ovn-kubernetes\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491342 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be5235d2-68c0-4721-b23c-a3e24721ae65-env-overrides\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491367 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-systemd\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491375 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-run-systemd\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491377 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-os-release\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491415 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-cni-bin\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491432 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49916299-de53-445c-b3f3-1fc134a6e4cc-host\") pod \"node-ca-lwlgs\" (UID: \"49916299-de53-445c-b3f3-1fc134a6e4cc\") " pod="openshift-image-registry/node-ca-lwlgs"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491441 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-run-k8s-cni-cncf-io\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491443 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-os-release\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491466 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5xv7\" (UniqueName: \"kubernetes.io/projected/58cfa3d8-1954-40b4-ac7c-f082a1e07777-kube-api-access-s5xv7\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491492 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-var-lib-openvswitch\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491512 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-kubernetes\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491534 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-host\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491576 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-device-dir\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491600 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f98cb24-fbc1-4718-bb62-2d985cadf144-host-slash\") pod \"iptables-alerter-99xdt\" (UID: \"4f98cb24-fbc1-4718-bb62-2d985cadf144\") " pod="openshift-network-operator/iptables-alerter-99xdt"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491607 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-var-lib-openvswitch\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491623 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-log-socket\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.492391 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491648 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-sysctl-d\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491657 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-run-k8s-cni-cncf-io\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491672 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-etc-selinux\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491750 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-device-dir\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491761 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be5235d2-68c0-4721-b23c-a3e24721ae65-env-overrides\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491776 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f98cb24-fbc1-4718-bb62-2d985cadf144-host-slash\") pod \"iptables-alerter-99xdt\" (UID: \"4f98cb24-fbc1-4718-bb62-2d985cadf144\") " pod="openshift-network-operator/iptables-alerter-99xdt"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491753 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-log-socket\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491786 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-etc-selinux\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491808 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-kubernetes\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491823 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-host\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491827 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491863 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-sysctl-d\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491861 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491900 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cfbc15a5-2478-44d8-902f-9d7269635055-etc-tuned\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491930 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/49916299-de53-445c-b3f3-1fc134a6e4cc-serviceca\") pod \"node-ca-lwlgs\" (UID: \"49916299-de53-445c-b3f3-1fc134a6e4cc\") " pod="openshift-image-registry/node-ca-lwlgs"
Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491929 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e3a4bfb9-cf61-4040-ba6a-b48997b19fce-konnectivity-ca\") pod \"konnectivity-agent-6b6vz\" (UID: \"e3a4bfb9-cf61-4040-ba6a-b48997b19fce\") " pod="kube-system/konnectivity-agent-6b6vz"
Apr 23 08:12:50.493187 ip-10-0-133-47
kubenswrapper[2559]: I0423 08:12:50.491950 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-kubelet\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.493187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491957 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-node-log\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491986 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-run-netns\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.491987 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-kubelet\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492011 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-run-ovn\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 
08:12:50.492019 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-host-run-netns\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492036 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be5235d2-68c0-4721-b23c-a3e24721ae65-ovnkube-config\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492058 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-run-ovn\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492060 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be5235d2-68c0-4721-b23c-a3e24721ae65-ovn-node-metrics-cert\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492050 2559 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492162 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be5235d2-68c0-4721-b23c-a3e24721ae65-ovnkube-script-lib\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492190 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpc7s\" (UniqueName: \"kubernetes.io/projected/cfbc15a5-2478-44d8-902f-9d7269635055-kube-api-access-xpc7s\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492264 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-multus-socket-dir-parent\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492288 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-run-multus-certs\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492319 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492344 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-modprobe-d\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492351 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/49916299-de53-445c-b3f3-1fc134a6e4cc-serviceca\") pod \"node-ca-lwlgs\" (UID: \"49916299-de53-445c-b3f3-1fc134a6e4cc\") " pod="openshift-image-registry/node-ca-lwlgs" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492365 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-lib-modules\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492407 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-multus-cni-dir\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.493999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492462 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-lib-modules\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492465 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-etc-kubernetes\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492499 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-etc-kubernetes\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492501 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9hzn\" (UniqueName: \"kubernetes.io/projected/ad7c4f52-9a6e-427d-8a37-1c27216d412e-kube-api-access-q9hzn\") pod \"network-metrics-daemon-nkshk\" (UID: \"ad7c4f52-9a6e-427d-8a37-1c27216d412e\") " pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492535 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be5235d2-68c0-4721-b23c-a3e24721ae65-ovnkube-config\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492618 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-multus-cni-dir\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492640 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-run-multus-certs\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492537 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-cni-binary-copy\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492734 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-multus-socket-dir-parent\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492745 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be5235d2-68c0-4721-b23c-a3e24721ae65-ovnkube-script-lib\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492848 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" 
(UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-modprobe-d\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492917 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-sys\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492958 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-socket-dir\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.492985 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-registration-dir\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493012 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-cnibin\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493035 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs\") pod \"network-metrics-daemon-nkshk\" (UID: \"ad7c4f52-9a6e-427d-8a37-1c27216d412e\") " pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493058 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-sysconfig\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.494936 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493081 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493105 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-var-lib-cni-bin\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493129 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f98cb24-fbc1-4718-bb62-2d985cadf144-iptables-alerter-script\") pod \"iptables-alerter-99xdt\" (UID: \"4f98cb24-fbc1-4718-bb62-2d985cadf144\") " pod="openshift-network-operator/iptables-alerter-99xdt" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493147 2559 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-socket-dir\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493153 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-systemd-units\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493200 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-systemd-units\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493214 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-run-openvswitch\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493240 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-var-lib-kubelet\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 
08:12:50.493248 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-registration-dir\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493261 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7a708d86-77eb-45d6-9cb2-d56380306e76-tmp-dir\") pod \"node-resolver-58xlf\" (UID: \"7a708d86-77eb-45d6-9cb2-d56380306e76\") " pod="openshift-dns/node-resolver-58xlf" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493294 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-run-openvswitch\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493296 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-run-netns\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493326 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-run-netns\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493333 2559 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-var-lib-cni-multus\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493359 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-hostroot\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:50.493363 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493370 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-sysconfig\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493389 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grs2v\" (UniqueName: \"kubernetes.io/projected/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-kube-api-access-grs2v\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz" Apr 23 08:12:50.495623 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493410 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/159f214d-7437-426c-9ffa-1470336cf132-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:50.493430 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs podName:ad7c4f52-9a6e-427d-8a37-1c27216d412e nodeName:}" failed. No retries permitted until 2026-04-23 08:12:50.993408968 +0000 UTC m=+2.048004214 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs") pod "network-metrics-daemon-nkshk" (UID: "ad7c4f52-9a6e-427d-8a37-1c27216d412e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493451 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-etc-openvswitch\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493460 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-sys\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493479 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e3a4bfb9-cf61-4040-ba6a-b48997b19fce-agent-certs\") pod \"konnectivity-agent-6b6vz\" 
(UID: \"e3a4bfb9-cf61-4040-ba6a-b48997b19fce\") " pod="kube-system/konnectivity-agent-6b6vz" Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493495 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-var-lib-cni-bin\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493502 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-multus-conf-dir\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493540 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-cnibin\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf" Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493560 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-sysctl-conf\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493574 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-host-var-lib-cni-multus\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " 
pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493597 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7a708d86-77eb-45d6-9cb2-d56380306e76-hosts-file\") pod \"node-resolver-58xlf\" (UID: \"7a708d86-77eb-45d6-9cb2-d56380306e76\") " pod="openshift-dns/node-resolver-58xlf"
Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493611 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-hostroot\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493622 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcghw\" (UniqueName: \"kubernetes.io/projected/7a708d86-77eb-45d6-9cb2-d56380306e76-kube-api-access-bcghw\") pod \"node-resolver-58xlf\" (UID: \"7a708d86-77eb-45d6-9cb2-d56380306e76\") " pod="openshift-dns/node-resolver-58xlf"
Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493908 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58cfa3d8-1954-40b4-ac7c-f082a1e07777-multus-conf-dir\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.493982 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f98cb24-fbc1-4718-bb62-2d985cadf144-iptables-alerter-script\") pod \"iptables-alerter-99xdt\" (UID: \"4f98cb24-fbc1-4718-bb62-2d985cadf144\") " pod="openshift-network-operator/iptables-alerter-99xdt"
Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.494016 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-etc-sysctl-conf\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.494064 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7a708d86-77eb-45d6-9cb2-d56380306e76-hosts-file\") pod \"node-resolver-58xlf\" (UID: \"7a708d86-77eb-45d6-9cb2-d56380306e76\") " pod="openshift-dns/node-resolver-58xlf"
Apr 23 08:12:50.496332 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.494356 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7a708d86-77eb-45d6-9cb2-d56380306e76-tmp-dir\") pod \"node-resolver-58xlf\" (UID: \"7a708d86-77eb-45d6-9cb2-d56380306e76\") " pod="openshift-dns/node-resolver-58xlf"
Apr 23 08:12:50.496841 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.494445 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfbc15a5-2478-44d8-902f-9d7269635055-var-lib-kubelet\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.496841 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.494494 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be5235d2-68c0-4721-b23c-a3e24721ae65-etc-openvswitch\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.496841 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.495555 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cfbc15a5-2478-44d8-902f-9d7269635055-etc-tuned\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.496841 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.495609 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cfbc15a5-2478-44d8-902f-9d7269635055-tmp\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.496841 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.496116 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be5235d2-68c0-4721-b23c-a3e24721ae65-ovn-node-metrics-cert\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.496841 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.496245 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e3a4bfb9-cf61-4040-ba6a-b48997b19fce-agent-certs\") pod \"konnectivity-agent-6b6vz\" (UID: \"e3a4bfb9-cf61-4040-ba6a-b48997b19fce\") " pod="kube-system/konnectivity-agent-6b6vz"
Apr 23 08:12:50.499859 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.499838 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p4bk\" (UniqueName: \"kubernetes.io/projected/4f98cb24-fbc1-4718-bb62-2d985cadf144-kube-api-access-2p4bk\") pod \"iptables-alerter-99xdt\" (UID: \"4f98cb24-fbc1-4718-bb62-2d985cadf144\") " pod="openshift-network-operator/iptables-alerter-99xdt"
Apr 23 08:12:50.500412 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.500389 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5xv7\" (UniqueName: \"kubernetes.io/projected/58cfa3d8-1954-40b4-ac7c-f082a1e07777-kube-api-access-s5xv7\") pod \"multus-tn7qf\" (UID: \"58cfa3d8-1954-40b4-ac7c-f082a1e07777\") " pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.500662 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.500634 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-492hr\" (UniqueName: \"kubernetes.io/projected/49916299-de53-445c-b3f3-1fc134a6e4cc-kube-api-access-492hr\") pod \"node-ca-lwlgs\" (UID: \"49916299-de53-445c-b3f3-1fc134a6e4cc\") " pod="openshift-image-registry/node-ca-lwlgs"
Apr 23 08:12:50.500797 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.500780 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2spmh\" (UniqueName: \"kubernetes.io/projected/be5235d2-68c0-4721-b23c-a3e24721ae65-kube-api-access-2spmh\") pod \"ovnkube-node-cqpq2\" (UID: \"be5235d2-68c0-4721-b23c-a3e24721ae65\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.500944 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.500928 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm7zl\" (UniqueName: \"kubernetes.io/projected/159f214d-7437-426c-9ffa-1470336cf132-kube-api-access-lm7zl\") pod \"aws-ebs-csi-driver-node-r8lvw\" (UID: \"159f214d-7437-426c-9ffa-1470336cf132\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw"
Apr 23 08:12:50.501195 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.501181 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw"
Apr 23 08:12:50.503459 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.503429 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9hzn\" (UniqueName: \"kubernetes.io/projected/ad7c4f52-9a6e-427d-8a37-1c27216d412e-kube-api-access-q9hzn\") pod \"network-metrics-daemon-nkshk\" (UID: \"ad7c4f52-9a6e-427d-8a37-1c27216d412e\") " pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:12:50.503558 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.503539 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpc7s\" (UniqueName: \"kubernetes.io/projected/cfbc15a5-2478-44d8-902f-9d7269635055-kube-api-access-xpc7s\") pod \"tuned-gxzvp\" (UID: \"cfbc15a5-2478-44d8-902f-9d7269635055\") " pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.504777 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.504729 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcghw\" (UniqueName: \"kubernetes.io/projected/7a708d86-77eb-45d6-9cb2-d56380306e76-kube-api-access-bcghw\") pod \"node-resolver-58xlf\" (UID: \"7a708d86-77eb-45d6-9cb2-d56380306e76\") " pod="openshift-dns/node-resolver-58xlf"
Apr 23 08:12:50.507903 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:50.507881 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod159f214d_7437_426c_9ffa_1470336cf132.slice/crio-02eb9645854438d24ae797743667ef2c783c369d4370784f96363c8c421981c3 WatchSource:0}: Error finding container 02eb9645854438d24ae797743667ef2c783c369d4370784f96363c8c421981c3: Status 404 returned error can't find the container with id 02eb9645854438d24ae797743667ef2c783c369d4370784f96363c8c421981c3
Apr 23 08:12:50.507985 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.507942 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-58xlf"
Apr 23 08:12:50.514669 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:50.514651 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a708d86_77eb_45d6_9cb2_d56380306e76.slice/crio-dc25cf4a8f831b30e9e9424b12005932b08f89555b5ae8a845cc57d836e64f9d WatchSource:0}: Error finding container dc25cf4a8f831b30e9e9424b12005932b08f89555b5ae8a845cc57d836e64f9d: Status 404 returned error can't find the container with id dc25cf4a8f831b30e9e9424b12005932b08f89555b5ae8a845cc57d836e64f9d
Apr 23 08:12:50.523808 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.523774 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-58xlf" event={"ID":"7a708d86-77eb-45d6-9cb2-d56380306e76","Type":"ContainerStarted","Data":"dc25cf4a8f831b30e9e9424b12005932b08f89555b5ae8a845cc57d836e64f9d"}
Apr 23 08:12:50.524608 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.524590 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" event={"ID":"159f214d-7437-426c-9ffa-1470336cf132","Type":"ContainerStarted","Data":"02eb9645854438d24ae797743667ef2c783c369d4370784f96363c8c421981c3"}
Apr 23 08:12:50.525468 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.525450 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal" event={"ID":"02f1197a0b1540bfd79dfadea3147481","Type":"ContainerStarted","Data":"c8759443678e304376e185e814bdf855c56a115d9f5714ede926a730a164898f"}
Apr 23 08:12:50.526305 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.526289 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-47.ec2.internal" event={"ID":"ffda46011da9673651bbfae81aa86260","Type":"ContainerStarted","Data":"a3b3cdac954ab2904a3d4f77a3e13cf4f8f34468587d71b8f021227a1604270f"}
Apr 23 08:12:50.594183 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594158 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grs2v\" (UniqueName: \"kubernetes.io/projected/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-kube-api-access-grs2v\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.594275 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594228 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-cnibin\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.594275 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594259 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c88dv\" (UniqueName: \"kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv\") pod \"network-check-target-zxxfs\" (UID: \"d399127c-4640-46ee-b923-43ca2adc7c1e\") " pod="openshift-network-diagnostics/network-check-target-zxxfs"
Apr 23 08:12:50.594376 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594323 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-cnibin\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.594458 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594378 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-system-cni-dir\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.594458 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594419 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-os-release\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.594458 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594454 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.594647 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594461 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-system-cni-dir\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.594647 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594485 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.594647 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594512 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-os-release\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.594647 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594518 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.594647 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594568 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-cni-binary-copy\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.594647 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594640 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.594917 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594901 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.594981 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594965 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.595016 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.594992 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-cni-binary-copy\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.600525 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:50.600512 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:12:50.600574 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:50.600529 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:12:50.600574 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:50.600538 2559 projected.go:194] Error preparing data for projected volume kube-api-access-c88dv for pod openshift-network-diagnostics/network-check-target-zxxfs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:50.600643 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:50.600580 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv podName:d399127c-4640-46ee-b923-43ca2adc7c1e nodeName:}" failed. No retries permitted until 2026-04-23 08:12:51.100567112 +0000 UTC m=+2.155162357 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-c88dv" (UniqueName: "kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv") pod "network-check-target-zxxfs" (UID: "d399127c-4640-46ee-b923-43ca2adc7c1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:50.602790 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.602775 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grs2v\" (UniqueName: \"kubernetes.io/projected/f22c4ac1-2d2c-4f45-9e34-42cd608dab18-kube-api-access-grs2v\") pod \"multus-additional-cni-plugins-5lbkz\" (UID: \"f22c4ac1-2d2c-4f45-9e34-42cd608dab18\") " pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.701887 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.701844 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gxzvp"
Apr 23 08:12:50.707990 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:50.707961 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfbc15a5_2478_44d8_902f_9d7269635055.slice/crio-14abe2581423533df129f1cc90d6ed12e10c0ba0443186bbcfb811ef76d71991 WatchSource:0}: Error finding container 14abe2581423533df129f1cc90d6ed12e10c0ba0443186bbcfb811ef76d71991: Status 404 returned error can't find the container with id 14abe2581423533df129f1cc90d6ed12e10c0ba0443186bbcfb811ef76d71991
Apr 23 08:12:50.710663 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.710649 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lwlgs"
Apr 23 08:12:50.716665 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:50.716639 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49916299_de53_445c_b3f3_1fc134a6e4cc.slice/crio-96f53fee1311b1cdc3ba783fcd29df65211113d6dd5ca6c7bf84284dc3646789 WatchSource:0}: Error finding container 96f53fee1311b1cdc3ba783fcd29df65211113d6dd5ca6c7bf84284dc3646789: Status 404 returned error can't find the container with id 96f53fee1311b1cdc3ba783fcd29df65211113d6dd5ca6c7bf84284dc3646789
Apr 23 08:12:50.724293 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.724277 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tn7qf"
Apr 23 08:12:50.729324 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:50.729297 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58cfa3d8_1954_40b4_ac7c_f082a1e07777.slice/crio-7dc71c2053c7186d7da778a97b3f4e94d9d45a0ef19187eb7d46e7d790263ca4 WatchSource:0}: Error finding container 7dc71c2053c7186d7da778a97b3f4e94d9d45a0ef19187eb7d46e7d790263ca4: Status 404 returned error can't find the container with id 7dc71c2053c7186d7da778a97b3f4e94d9d45a0ef19187eb7d46e7d790263ca4
Apr 23 08:12:50.738759 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.738744 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-99xdt"
Apr 23 08:12:50.744331 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.744315 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:12:50.745961 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:50.745923 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f98cb24_fbc1_4718_bb62_2d985cadf144.slice/crio-1ad890d410396a37ff10e7402ec9a87d8a26c650230311f450fc256b5c558aa6 WatchSource:0}: Error finding container 1ad890d410396a37ff10e7402ec9a87d8a26c650230311f450fc256b5c558aa6: Status 404 returned error can't find the container with id 1ad890d410396a37ff10e7402ec9a87d8a26c650230311f450fc256b5c558aa6
Apr 23 08:12:50.750874 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:50.750845 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe5235d2_68c0_4721_b23c_a3e24721ae65.slice/crio-0938f67e2b3335e7941d279927c038183fc905ce124b5fb041e36bfae782be10 WatchSource:0}: Error finding container 0938f67e2b3335e7941d279927c038183fc905ce124b5fb041e36bfae782be10: Status 404 returned error can't find the container with id 0938f67e2b3335e7941d279927c038183fc905ce124b5fb041e36bfae782be10
Apr 23 08:12:50.765915 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.765895 2559 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:12:50.787997 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.787977 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6b6vz"
Apr 23 08:12:50.793389 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:50.793367 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3a4bfb9_cf61_4040_ba6a_b48997b19fce.slice/crio-3684940af9ce34b8921e1603ccba78043a9d1b67a02274864a0d503598845e03 WatchSource:0}: Error finding container 3684940af9ce34b8921e1603ccba78043a9d1b67a02274864a0d503598845e03: Status 404 returned error can't find the container with id 3684940af9ce34b8921e1603ccba78043a9d1b67a02274864a0d503598845e03
Apr 23 08:12:50.813653 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.813632 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5lbkz"
Apr 23 08:12:50.818925 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:12:50.818907 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf22c4ac1_2d2c_4f45_9e34_42cd608dab18.slice/crio-63ed633d9bb4e73831b125870501029d76a4a010f78c5952a5a972b78234d6aa WatchSource:0}: Error finding container 63ed633d9bb4e73831b125870501029d76a4a010f78c5952a5a972b78234d6aa: Status 404 returned error can't find the container with id 63ed633d9bb4e73831b125870501029d76a4a010f78c5952a5a972b78234d6aa
Apr 23 08:12:50.996020 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:50.995946 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs\") pod \"network-metrics-daemon-nkshk\" (UID: \"ad7c4f52-9a6e-427d-8a37-1c27216d412e\") " pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:12:50.996172 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:50.996050 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:50.996172 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:50.996108 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs podName:ad7c4f52-9a6e-427d-8a37-1c27216d412e nodeName:}" failed. No retries permitted until 2026-04-23 08:12:51.996088145 +0000 UTC m=+3.050683398 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs") pod "network-metrics-daemon-nkshk" (UID: "ad7c4f52-9a6e-427d-8a37-1c27216d412e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:51.194727 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:51.194551 2559 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:12:51.197950 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:51.197922 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c88dv\" (UniqueName: \"kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv\") pod \"network-check-target-zxxfs\" (UID: \"d399127c-4640-46ee-b923-43ca2adc7c1e\") " pod="openshift-network-diagnostics/network-check-target-zxxfs"
Apr 23 08:12:51.198127 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:51.198110 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:12:51.198190 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:51.198135 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:12:51.198190 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:51.198149 2559 projected.go:194] Error preparing data for projected volume kube-api-access-c88dv for pod openshift-network-diagnostics/network-check-target-zxxfs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:51.198298 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:51.198203 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv podName:d399127c-4640-46ee-b923-43ca2adc7c1e nodeName:}" failed. No retries permitted until 2026-04-23 08:12:52.198184305 +0000 UTC m=+3.252779552 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-c88dv" (UniqueName: "kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv") pod "network-check-target-zxxfs" (UID: "d399127c-4640-46ee-b923-43ca2adc7c1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:51.389617 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:51.389402 2559 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:12:51.425234 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:51.425149 2559 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:07:50 +0000 UTC" deadline="2027-10-28 06:17:49.500749426 +0000 UTC"
Apr 23 08:12:51.425234 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:51.425178 2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13270h4m58.075574358s"
Apr 23 08:12:51.549773 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:51.549735 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6b6vz" event={"ID":"e3a4bfb9-cf61-4040-ba6a-b48997b19fce","Type":"ContainerStarted","Data":"3684940af9ce34b8921e1603ccba78043a9d1b67a02274864a0d503598845e03"}
Apr 23 08:12:51.569344 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:51.569303 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-99xdt" event={"ID":"4f98cb24-fbc1-4718-bb62-2d985cadf144","Type":"ContainerStarted","Data":"1ad890d410396a37ff10e7402ec9a87d8a26c650230311f450fc256b5c558aa6"}
Apr 23 08:12:51.576460 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:51.576257 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tn7qf" event={"ID":"58cfa3d8-1954-40b4-ac7c-f082a1e07777","Type":"ContainerStarted","Data":"7dc71c2053c7186d7da778a97b3f4e94d9d45a0ef19187eb7d46e7d790263ca4"}
Apr 23 08:12:51.603508 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:51.603479 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lwlgs" event={"ID":"49916299-de53-445c-b3f3-1fc134a6e4cc","Type":"ContainerStarted","Data":"96f53fee1311b1cdc3ba783fcd29df65211113d6dd5ca6c7bf84284dc3646789"}
Apr 23 08:12:51.621285 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:51.621258 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" event={"ID":"cfbc15a5-2478-44d8-902f-9d7269635055","Type":"ContainerStarted","Data":"14abe2581423533df129f1cc90d6ed12e10c0ba0443186bbcfb811ef76d71991"}
Apr 23 08:12:51.645858 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:51.645722 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5lbkz" event={"ID":"f22c4ac1-2d2c-4f45-9e34-42cd608dab18","Type":"ContainerStarted","Data":"63ed633d9bb4e73831b125870501029d76a4a010f78c5952a5a972b78234d6aa"}
Apr 23 08:12:51.679079 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:51.679024 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" event={"ID":"be5235d2-68c0-4721-b23c-a3e24721ae65","Type":"ContainerStarted","Data":"0938f67e2b3335e7941d279927c038183fc905ce124b5fb041e36bfae782be10"}
Apr 23 08:12:52.005003 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:52.004923 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs\") pod \"network-metrics-daemon-nkshk\" (UID: \"ad7c4f52-9a6e-427d-8a37-1c27216d412e\") " pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:12:52.005160 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:52.005090 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:52.005160 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:52.005146 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs podName:ad7c4f52-9a6e-427d-8a37-1c27216d412e nodeName:}" failed. No retries permitted until 2026-04-23 08:12:54.005127838 +0000 UTC m=+5.059723087 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs") pod "network-metrics-daemon-nkshk" (UID: "ad7c4f52-9a6e-427d-8a37-1c27216d412e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:52.206907 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:52.206229 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c88dv\" (UniqueName: \"kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv\") pod \"network-check-target-zxxfs\" (UID: \"d399127c-4640-46ee-b923-43ca2adc7c1e\") " pod="openshift-network-diagnostics/network-check-target-zxxfs"
Apr 23 08:12:52.206907 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:52.206411 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:12:52.206907 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:52.206428 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:12:52.206907 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:52.206439 2559 projected.go:194] Error preparing data for projected volume kube-api-access-c88dv for pod openshift-network-diagnostics/network-check-target-zxxfs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:52.206907 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:52.206491 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv podName:d399127c-4640-46ee-b923-43ca2adc7c1e nodeName:}" failed. No retries permitted until 2026-04-23 08:12:54.206473279 +0000 UTC m=+5.261068530 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-c88dv" (UniqueName: "kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv") pod "network-check-target-zxxfs" (UID: "d399127c-4640-46ee-b923-43ca2adc7c1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:12:52.370933 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:52.370904 2559 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:12:52.425948 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:52.425878 2559 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:07:50 +0000 UTC" deadline="2027-10-31 06:40:08.333811978 +0000 UTC"
Apr 23 08:12:52.425948 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:52.425920 2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13342h27m15.907906242s"
Apr 23 08:12:52.521921 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:52.521893 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:12:52.522087 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:52.522035 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e"
Apr 23 08:12:52.522478 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:52.522458 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs"
Apr 23 08:12:52.522582 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:52.522556 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e"
Apr 23 08:12:54.028851 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:54.028815 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs\") pod \"network-metrics-daemon-nkshk\" (UID: \"ad7c4f52-9a6e-427d-8a37-1c27216d412e\") " pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:12:54.029304 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:54.028980 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:12:54.029304 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:54.029040 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs podName:ad7c4f52-9a6e-427d-8a37-1c27216d412e nodeName:}" failed. No retries permitted until 2026-04-23 08:12:58.029020623 +0000 UTC m=+9.083615869 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs") pod "network-metrics-daemon-nkshk" (UID: "ad7c4f52-9a6e-427d-8a37-1c27216d412e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:54.231815 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:54.231125 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c88dv\" (UniqueName: \"kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv\") pod \"network-check-target-zxxfs\" (UID: \"d399127c-4640-46ee-b923-43ca2adc7c1e\") " pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:12:54.231815 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:54.231354 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:12:54.231815 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:54.231375 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:12:54.231815 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:54.231388 2559 projected.go:194] Error preparing data for projected volume kube-api-access-c88dv for pod openshift-network-diagnostics/network-check-target-zxxfs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:54.231815 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:54.231443 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv podName:d399127c-4640-46ee-b923-43ca2adc7c1e nodeName:}" failed. 
No retries permitted until 2026-04-23 08:12:58.231423924 +0000 UTC m=+9.286019170 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-c88dv" (UniqueName: "kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv") pod "network-check-target-zxxfs" (UID: "d399127c-4640-46ee-b923-43ca2adc7c1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:54.522586 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:54.522550 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:12:54.522586 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:54.522570 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:12:54.522845 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:54.522756 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e" Apr 23 08:12:54.522977 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:54.522900 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e" Apr 23 08:12:56.522544 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:56.522443 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:12:56.522544 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:56.522496 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:12:56.523115 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:56.522592 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e" Apr 23 08:12:56.523115 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:56.522682 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e" Apr 23 08:12:58.059670 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:58.059625 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs\") pod \"network-metrics-daemon-nkshk\" (UID: \"ad7c4f52-9a6e-427d-8a37-1c27216d412e\") " pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:12:58.060148 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:58.059892 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:58.060148 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:58.059969 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs podName:ad7c4f52-9a6e-427d-8a37-1c27216d412e nodeName:}" failed. No retries permitted until 2026-04-23 08:13:06.059945374 +0000 UTC m=+17.114540740 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs") pod "network-metrics-daemon-nkshk" (UID: "ad7c4f52-9a6e-427d-8a37-1c27216d412e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:12:58.261251 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:58.261187 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c88dv\" (UniqueName: \"kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv\") pod \"network-check-target-zxxfs\" (UID: \"d399127c-4640-46ee-b923-43ca2adc7c1e\") " pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:12:58.261433 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:58.261339 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:12:58.261433 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:58.261360 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:12:58.261433 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:58.261374 2559 projected.go:194] Error preparing data for projected volume kube-api-access-c88dv for pod openshift-network-diagnostics/network-check-target-zxxfs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:58.261433 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:58.261429 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv podName:d399127c-4640-46ee-b923-43ca2adc7c1e nodeName:}" failed. 
No retries permitted until 2026-04-23 08:13:06.261410403 +0000 UTC m=+17.316005665 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-c88dv" (UniqueName: "kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv") pod "network-check-target-zxxfs" (UID: "d399127c-4640-46ee-b923-43ca2adc7c1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:12:58.522827 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:58.522746 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:12:58.522997 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:58.522875 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e" Apr 23 08:12:58.522997 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:12:58.522958 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:12:58.523109 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:12:58.523041 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e" Apr 23 08:13:00.523005 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:00.522723 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:13:00.523407 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:00.522759 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:13:00.523407 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:00.523060 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e" Apr 23 08:13:00.523407 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:00.523146 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e" Apr 23 08:13:02.522244 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:02.522205 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:13:02.522657 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:02.522221 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:13:02.522657 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:02.522332 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e" Apr 23 08:13:02.522657 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:02.522416 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e" Apr 23 08:13:04.522495 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:04.522451 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:13:04.522965 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:04.522451 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:13:04.522965 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:04.522594 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e" Apr 23 08:13:04.522965 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:04.522693 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e" Apr 23 08:13:06.122213 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:06.122183 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs\") pod \"network-metrics-daemon-nkshk\" (UID: \"ad7c4f52-9a6e-427d-8a37-1c27216d412e\") " pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:13:06.122692 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:06.122314 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:13:06.122692 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:06.122377 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs podName:ad7c4f52-9a6e-427d-8a37-1c27216d412e nodeName:}" failed. No retries permitted until 2026-04-23 08:13:22.122361049 +0000 UTC m=+33.176956293 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs") pod "network-metrics-daemon-nkshk" (UID: "ad7c4f52-9a6e-427d-8a37-1c27216d412e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:13:06.323525 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:06.323479 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c88dv\" (UniqueName: \"kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv\") pod \"network-check-target-zxxfs\" (UID: \"d399127c-4640-46ee-b923-43ca2adc7c1e\") " pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:13:06.323722 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:06.323671 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:13:06.323722 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:06.323691 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:13:06.323722 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:06.323718 2559 projected.go:194] Error preparing data for projected volume kube-api-access-c88dv for pod openshift-network-diagnostics/network-check-target-zxxfs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:13:06.323880 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:06.323768 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv podName:d399127c-4640-46ee-b923-43ca2adc7c1e nodeName:}" failed. 
No retries permitted until 2026-04-23 08:13:22.323753638 +0000 UTC m=+33.378348883 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-c88dv" (UniqueName: "kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv") pod "network-check-target-zxxfs" (UID: "d399127c-4640-46ee-b923-43ca2adc7c1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:13:06.522270 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:06.522187 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:13:06.522270 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:06.522210 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:13:06.522462 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:06.522282 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e" Apr 23 08:13:06.522462 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:06.522440 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e" Apr 23 08:13:08.521811 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:08.521792 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:13:08.521811 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:08.521805 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:13:08.522146 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:08.521874 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e" Apr 23 08:13:08.522146 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:08.521932 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e" Apr 23 08:13:09.719438 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.719074 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-58xlf" event={"ID":"7a708d86-77eb-45d6-9cb2-d56380306e76","Type":"ContainerStarted","Data":"a3434971202270cf5a8e1cf95ca58adbe10799a15640920601b7ad798ddcd870"} Apr 23 08:13:09.720282 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.720262 2559 generic.go:358] "Generic (PLEG): container finished" podID="02f1197a0b1540bfd79dfadea3147481" containerID="3ebf65c9fe01ea31ca5be34873bb9022334010956c6526ab2a956b3c0e9e2afc" exitCode=0 Apr 23 08:13:09.720380 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.720312 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal" event={"ID":"02f1197a0b1540bfd79dfadea3147481","Type":"ContainerDied","Data":"3ebf65c9fe01ea31ca5be34873bb9022334010956c6526ab2a956b3c0e9e2afc"} Apr 23 08:13:09.720487 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.720471 2559 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal" Apr 23 08:13:09.721472 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.721452 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-47.ec2.internal" event={"ID":"ffda46011da9673651bbfae81aa86260","Type":"ContainerStarted","Data":"01039ec50b8894c191083300275bca524281eebb34728c29d122762f055a1cf4"} Apr 23 08:13:09.722598 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.722579 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6b6vz" event={"ID":"e3a4bfb9-cf61-4040-ba6a-b48997b19fce","Type":"ContainerStarted","Data":"0bb3abf766e1bee02a1e9ac432d11d1ac9a0302797a5706aba2b9e5be745a314"} Apr 23 
08:13:09.723746 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.723727 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tn7qf" event={"ID":"58cfa3d8-1954-40b4-ac7c-f082a1e07777","Type":"ContainerStarted","Data":"21edabd01f988ac96fb0467d79224ce322790367e504eefaca7acb0795d1795f"} Apr 23 08:13:09.724780 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.724762 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lwlgs" event={"ID":"49916299-de53-445c-b3f3-1fc134a6e4cc","Type":"ContainerStarted","Data":"69f9b548728be843b50ffa75a2fb34b162745f31534fe53b604c74630d8b6624"} Apr 23 08:13:09.725913 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.725896 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" event={"ID":"cfbc15a5-2478-44d8-902f-9d7269635055","Type":"ContainerStarted","Data":"760a0baf811279091ee24112f73bac618988903b3b4e357428663e5ebd8ca255"} Apr 23 08:13:09.727022 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.727003 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" event={"ID":"159f214d-7437-426c-9ffa-1470336cf132","Type":"ContainerStarted","Data":"b510892dd81b0ea559dac8c91b717711b558c312e9609a46963b25bcf32b59d7"} Apr 23 08:13:09.728168 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.728152 2559 generic.go:358] "Generic (PLEG): container finished" podID="f22c4ac1-2d2c-4f45-9e34-42cd608dab18" containerID="319e6e6d67d33c199016389da2c627ff0e201922d4292649c6d1d6f0f863b1aa" exitCode=0 Apr 23 08:13:09.728238 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.728206 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5lbkz" event={"ID":"f22c4ac1-2d2c-4f45-9e34-42cd608dab18","Type":"ContainerDied","Data":"319e6e6d67d33c199016389da2c627ff0e201922d4292649c6d1d6f0f863b1aa"} Apr 23 08:13:09.730410 
ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.730395 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/ovn-acl-logging/0.log" Apr 23 08:13:09.730630 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.730609 2559 generic.go:358] "Generic (PLEG): container finished" podID="be5235d2-68c0-4721-b23c-a3e24721ae65" containerID="9a4f39f7306987f13d3b8bc36cb3e6f9ae204b087a389e8a7893c8c65878a2f5" exitCode=1 Apr 23 08:13:09.730716 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.730643 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" event={"ID":"be5235d2-68c0-4721-b23c-a3e24721ae65","Type":"ContainerStarted","Data":"70d159a59664cf0bb8dcae04c3d607c16b816f02b80220a44f9634ce34513c02"} Apr 23 08:13:09.730716 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.730661 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" event={"ID":"be5235d2-68c0-4721-b23c-a3e24721ae65","Type":"ContainerStarted","Data":"672fff05d1c976595748b26345cf1aba824cae2c23a27dc83e062dffc1721ab3"} Apr 23 08:13:09.730716 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.730673 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" event={"ID":"be5235d2-68c0-4721-b23c-a3e24721ae65","Type":"ContainerStarted","Data":"2527ccd1d211985a25b34dbedfb91d9e2e72329adb4f920f31f6bd65d24acd69"} Apr 23 08:13:09.730716 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.730682 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" event={"ID":"be5235d2-68c0-4721-b23c-a3e24721ae65","Type":"ContainerStarted","Data":"4ece7ce8248e71ff9abb6518564d49b5b2344d3ac04f646d0126d56acdd42f34"} Apr 23 08:13:09.730716 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.730690 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" event={"ID":"be5235d2-68c0-4721-b23c-a3e24721ae65","Type":"ContainerDied","Data":"9a4f39f7306987f13d3b8bc36cb3e6f9ae204b087a389e8a7893c8c65878a2f5"}
Apr 23 08:13:09.730716 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.730712 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" event={"ID":"be5235d2-68c0-4721-b23c-a3e24721ae65","Type":"ContainerStarted","Data":"b3db947a5f5d95e86bd5ccc201a35625b6654533355305f91f45de00fce538cf"}
Apr 23 08:13:09.730916 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.730783 2559 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 08:13:09.731896 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.731879 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal"]
Apr 23 08:13:09.734109 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.734075 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-58xlf" podStartSLOduration=7.430644219 podStartE2EDuration="20.734064872s" podCreationTimestamp="2026-04-23 08:12:49 +0000 UTC" firstStartedPulling="2026-04-23 08:12:50.516963831 +0000 UTC m=+1.571559079" lastFinishedPulling="2026-04-23 08:13:03.820384472 +0000 UTC m=+14.874979732" observedRunningTime="2026-04-23 08:13:09.733619912 +0000 UTC m=+20.788215180" watchObservedRunningTime="2026-04-23 08:13:09.734064872 +0000 UTC m=+20.788660139"
Apr 23 08:13:09.747394 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.747354 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6b6vz" podStartSLOduration=3.036847224 podStartE2EDuration="20.747340266s" podCreationTimestamp="2026-04-23 08:12:49 +0000 UTC" firstStartedPulling="2026-04-23 08:12:50.794730806 +0000 UTC m=+1.849326053" lastFinishedPulling="2026-04-23 08:13:08.50522385 +0000 UTC m=+19.559819095" observedRunningTime="2026-04-23 08:13:09.746417726 +0000 UTC m=+20.801012992" watchObservedRunningTime="2026-04-23 08:13:09.747340266 +0000 UTC m=+20.801935732"
Apr 23 08:13:09.791285 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.791241 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-47.ec2.internal" podStartSLOduration=19.791225813 podStartE2EDuration="19.791225813s" podCreationTimestamp="2026-04-23 08:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:13:09.790892542 +0000 UTC m=+20.845487810" watchObservedRunningTime="2026-04-23 08:13:09.791225813 +0000 UTC m=+20.845821082"
Apr 23 08:13:09.806471 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.806433 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lwlgs" podStartSLOduration=3.01916264 podStartE2EDuration="20.806422824s" podCreationTimestamp="2026-04-23 08:12:49 +0000 UTC" firstStartedPulling="2026-04-23 08:12:50.71794851 +0000 UTC m=+1.772543755" lastFinishedPulling="2026-04-23 08:13:08.505208691 +0000 UTC m=+19.559803939" observedRunningTime="2026-04-23 08:13:09.806228938 +0000 UTC m=+20.860824206" watchObservedRunningTime="2026-04-23 08:13:09.806422824 +0000 UTC m=+20.861018091"
Apr 23 08:13:09.843281 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.843243 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-gxzvp" podStartSLOduration=3.046273048 podStartE2EDuration="20.843232106s" podCreationTimestamp="2026-04-23 08:12:49 +0000 UTC" firstStartedPulling="2026-04-23 08:12:50.70937509 +0000 UTC m=+1.763970334" lastFinishedPulling="2026-04-23 08:13:08.506334132 +0000 UTC m=+19.560929392" observedRunningTime="2026-04-23 08:13:09.821486959 +0000 UTC m=+20.876082226" watchObservedRunningTime="2026-04-23 08:13:09.843232106 +0000 UTC m=+20.897827352"
Apr 23 08:13:09.843417 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:09.843393 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tn7qf" podStartSLOduration=3.065690375 podStartE2EDuration="20.843388338s" podCreationTimestamp="2026-04-23 08:12:49 +0000 UTC" firstStartedPulling="2026-04-23 08:12:50.73068492 +0000 UTC m=+1.785280165" lastFinishedPulling="2026-04-23 08:13:08.508382882 +0000 UTC m=+19.562978128" observedRunningTime="2026-04-23 08:13:09.843315657 +0000 UTC m=+20.897910921" watchObservedRunningTime="2026-04-23 08:13:09.843388338 +0000 UTC m=+20.897983604"
Apr 23 08:13:10.429496 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:10.429468 2559 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 08:13:10.463976 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:10.463885 2559 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T08:13:10.429492877Z","UUID":"54fd7d22-077d-4232-b8d9-16f1358dc425","Handler":null,"Name":"","Endpoint":""}
Apr 23 08:13:10.465452 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:10.465431 2559 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 08:13:10.465550 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:10.465461 2559 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 08:13:10.522250 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:10.522225 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs"
Apr 23 08:13:10.522347 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:10.522225 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:13:10.522428 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:10.522315 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e"
Apr 23 08:13:10.522535 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:10.522441 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e"
Apr 23 08:13:10.734901 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:10.734830 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" event={"ID":"159f214d-7437-426c-9ffa-1470336cf132","Type":"ContainerStarted","Data":"9c8be86972f6b1c0a990e836bd5665d0ea3f3a1722620fa09e77f62f22a58e12"}
Apr 23 08:13:10.736644 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:10.736604 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal" event={"ID":"02f1197a0b1540bfd79dfadea3147481","Type":"ContainerStarted","Data":"1ccf19daf6494ca2038c6c1dbaa75fabefadad7ec77a4b9da26f259104708877"}
Apr 23 08:13:10.738000 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:10.737972 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-99xdt" event={"ID":"4f98cb24-fbc1-4718-bb62-2d985cadf144","Type":"ContainerStarted","Data":"b74864e4a1c8fba660ba53951d2ef97bc007a9202c127dbc0cbd2ab3f2f6fa0a"}
Apr 23 08:13:10.750647 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:10.750594 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-47.ec2.internal" podStartSLOduration=1.750579127 podStartE2EDuration="1.750579127s" podCreationTimestamp="2026-04-23 08:13:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:13:10.750519056 +0000 UTC m=+21.805114323" watchObservedRunningTime="2026-04-23 08:13:10.750579127 +0000 UTC m=+21.805174395"
Apr 23 08:13:10.765473 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:10.765438 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-99xdt" podStartSLOduration=8.692527049 podStartE2EDuration="21.765426055s" podCreationTimestamp="2026-04-23 08:12:49 +0000 UTC" firstStartedPulling="2026-04-23 08:12:50.747483093 +0000 UTC m=+1.802078347" lastFinishedPulling="2026-04-23 08:13:03.820382093 +0000 UTC m=+14.874977353" observedRunningTime="2026-04-23 08:13:10.765349432 +0000 UTC m=+21.819944700" watchObservedRunningTime="2026-04-23 08:13:10.765426055 +0000 UTC m=+21.820021324"
Apr 23 08:13:11.741651 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:11.741560 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" event={"ID":"159f214d-7437-426c-9ffa-1470336cf132","Type":"ContainerStarted","Data":"ed785793ad96d6727889c88987c6eb6e2d5673b74d6bca02028f6b3837b92d7e"}
Apr 23 08:13:11.744546 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:11.744527 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/ovn-acl-logging/0.log"
Apr 23 08:13:11.744906 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:11.744881 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" event={"ID":"be5235d2-68c0-4721-b23c-a3e24721ae65","Type":"ContainerStarted","Data":"f7bdcb6c1c37c6efac0aa37e00d62e688b02cf7311550ceaf78662f1f817bcd5"}
Apr 23 08:13:11.761602 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:11.761560 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r8lvw" podStartSLOduration=1.785979664 podStartE2EDuration="22.761544335s" podCreationTimestamp="2026-04-23 08:12:49 +0000 UTC" firstStartedPulling="2026-04-23 08:12:50.509410062 +0000 UTC m=+1.564005307" lastFinishedPulling="2026-04-23 08:13:11.48497472 +0000 UTC m=+22.539569978" observedRunningTime="2026-04-23 08:13:11.761245956 +0000 UTC m=+22.815841223" watchObservedRunningTime="2026-04-23 08:13:11.761544335 +0000 UTC m=+22.816139601"
Apr 23 08:13:11.886688 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:11.886655 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6b6vz"
Apr 23 08:13:11.887395 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:11.887376 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6b6vz"
Apr 23 08:13:12.522757 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:12.522725 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs"
Apr 23 08:13:12.522757 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:12.522747 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:13:12.522957 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:12.522815 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e"
Apr 23 08:13:12.522957 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:12.522934 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e"
Apr 23 08:13:12.868094 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:12.868060 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6b6vz"
Apr 23 08:13:12.868778 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:12.868755 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6b6vz"
Apr 23 08:13:14.522003 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:14.521884 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:13:14.522472 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:14.521893 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs"
Apr 23 08:13:14.522472 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:14.522082 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e"
Apr 23 08:13:14.522472 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:14.522165 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e"
Apr 23 08:13:14.752298 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:14.752238 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/ovn-acl-logging/0.log"
Apr 23 08:13:14.752580 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:14.752561 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" event={"ID":"be5235d2-68c0-4721-b23c-a3e24721ae65","Type":"ContainerStarted","Data":"725351c87c234ba6f1a34d81073b2bfec4dd6d3866c9fadcddd09742bf7353bc"}
Apr 23 08:13:14.752874 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:14.752844 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:13:14.753003 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:14.752975 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:13:14.753095 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:14.753077 2559 scope.go:117] "RemoveContainer" containerID="9a4f39f7306987f13d3b8bc36cb3e6f9ae204b087a389e8a7893c8c65878a2f5"
Apr 23 08:13:14.767058 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:14.767041 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:13:15.756329 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:15.756013 2559 generic.go:358] "Generic (PLEG): container finished" podID="f22c4ac1-2d2c-4f45-9e34-42cd608dab18" containerID="ec7bd07bf8bead8872cc19e9a4ddd8a6dda411d6fa3fadffa1e5aeea887c235e" exitCode=0
Apr 23 08:13:15.756767 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:15.756082 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5lbkz" event={"ID":"f22c4ac1-2d2c-4f45-9e34-42cd608dab18","Type":"ContainerDied","Data":"ec7bd07bf8bead8872cc19e9a4ddd8a6dda411d6fa3fadffa1e5aeea887c235e"}
Apr 23 08:13:15.760113 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:15.760045 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/ovn-acl-logging/0.log"
Apr 23 08:13:15.760419 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:15.760395 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" event={"ID":"be5235d2-68c0-4721-b23c-a3e24721ae65","Type":"ContainerStarted","Data":"ff38a9903a2a6184b6e6fffd350a25ad3e5db7a01abab31ae28db392fa9ff755"}
Apr 23 08:13:15.760765 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:15.760745 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:13:15.775972 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:15.775948 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2"
Apr 23 08:13:15.805311 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:15.804518 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" podStartSLOduration=8.805692165 podStartE2EDuration="26.804497954s" podCreationTimestamp="2026-04-23 08:12:49 +0000 UTC" firstStartedPulling="2026-04-23 08:12:50.752263739 +0000 UTC m=+1.806858988" lastFinishedPulling="2026-04-23 08:13:08.75106953 +0000 UTC m=+19.805664777" observedRunningTime="2026-04-23 08:13:15.803792227 +0000 UTC m=+26.858387495" watchObservedRunningTime="2026-04-23 08:13:15.804497954 +0000 UTC m=+26.859093291"
Apr 23 08:13:16.471548 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:16.471520 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zxxfs"]
Apr 23 08:13:16.471756 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:16.471652 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs"
Apr 23 08:13:16.471829 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:16.471770 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e"
Apr 23 08:13:16.473543 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:16.473517 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nkshk"]
Apr 23 08:13:16.473648 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:16.473632 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:13:16.474733 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:16.473764 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e"
Apr 23 08:13:16.764040 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:16.764014 2559 generic.go:358] "Generic (PLEG): container finished" podID="f22c4ac1-2d2c-4f45-9e34-42cd608dab18" containerID="111fba5fb4ab1defae89130c0613abb99deade7ae11799bbbfaee91c5578a8b5" exitCode=0
Apr 23 08:13:16.764396 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:16.764093 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5lbkz" event={"ID":"f22c4ac1-2d2c-4f45-9e34-42cd608dab18","Type":"ContainerDied","Data":"111fba5fb4ab1defae89130c0613abb99deade7ae11799bbbfaee91c5578a8b5"}
Apr 23 08:13:17.770027 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:17.769755 2559 generic.go:358] "Generic (PLEG): container finished" podID="f22c4ac1-2d2c-4f45-9e34-42cd608dab18" containerID="6a3118c68ab8c51ce207db056d75bb94fff7ff42dd29c05f72996e1a579b2044" exitCode=0
Apr 23 08:13:17.770027 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:17.769843 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5lbkz" event={"ID":"f22c4ac1-2d2c-4f45-9e34-42cd608dab18","Type":"ContainerDied","Data":"6a3118c68ab8c51ce207db056d75bb94fff7ff42dd29c05f72996e1a579b2044"}
Apr 23 08:13:18.522095 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:18.522063 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:13:18.522095 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:18.522093 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs"
Apr 23 08:13:18.522295 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:18.522191 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e"
Apr 23 08:13:18.522464 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:18.522438 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e"
Apr 23 08:13:20.521930 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:20.521896 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs"
Apr 23 08:13:20.521930 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:20.521923 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:13:20.522508 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:20.522008 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zxxfs" podUID="d399127c-4640-46ee-b923-43ca2adc7c1e"
Apr 23 08:13:20.522508 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:20.522132 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkshk" podUID="ad7c4f52-9a6e-427d-8a37-1c27216d412e"
Apr 23 08:13:21.732924 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.732858 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-47.ec2.internal" event="NodeReady"
Apr 23 08:13:21.733431 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.733002 2559 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 08:13:21.800657 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.800622 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qlns9"]
Apr 23 08:13:21.829769 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.829742 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4jflg"]
Apr 23 08:13:21.829904 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.829885 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qlns9"
Apr 23 08:13:21.834983 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.834960 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 08:13:21.835395 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.835369 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dcms6\""
Apr 23 08:13:21.835513 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.835440 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 08:13:21.847596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.847575 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qlns9"]
Apr 23 08:13:21.847596 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.847599 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4jflg"]
Apr 23 08:13:21.847769 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.847692 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4jflg"
Apr 23 08:13:21.850116 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.850097 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 08:13:21.850116 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.850109 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 08:13:21.850273 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.850168 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sb9bx\""
Apr 23 08:13:21.850835 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.850815 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 08:13:21.942208 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.942173 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjkr9\" (UniqueName: \"kubernetes.io/projected/61623770-f083-4741-a16c-427c7d637226-kube-api-access-bjkr9\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9"
Apr 23 08:13:21.942372 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.942224 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/61623770-f083-4741-a16c-427c7d637226-config-volume\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9"
Apr 23 08:13:21.942372 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.942278 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9"
Apr 23 08:13:21.942372 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.942298 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61623770-f083-4741-a16c-427c7d637226-tmp-dir\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9"
Apr 23 08:13:21.942372 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.942331 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zjgg\" (UniqueName: \"kubernetes.io/projected/273cd29d-2e09-4e88-b51d-9b39805b5849-kube-api-access-8zjgg\") pod \"ingress-canary-4jflg\" (UID: \"273cd29d-2e09-4e88-b51d-9b39805b5849\") " pod="openshift-ingress-canary/ingress-canary-4jflg"
Apr 23 08:13:21.942567 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:21.942397 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert\") pod \"ingress-canary-4jflg\" (UID: \"273cd29d-2e09-4e88-b51d-9b39805b5849\") " pod="openshift-ingress-canary/ingress-canary-4jflg"
Apr 23 08:13:22.042926 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.042898 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert\") pod \"ingress-canary-4jflg\" (UID: \"273cd29d-2e09-4e88-b51d-9b39805b5849\") " pod="openshift-ingress-canary/ingress-canary-4jflg"
Apr 23 08:13:22.043079 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.042938 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjkr9\" (UniqueName: \"kubernetes.io/projected/61623770-f083-4741-a16c-427c7d637226-kube-api-access-bjkr9\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9"
Apr 23 08:13:22.043079 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.042974 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/61623770-f083-4741-a16c-427c7d637226-config-volume\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9"
Apr 23 08:13:22.043079 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.043014 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9"
Apr 23 08:13:22.043079 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.043037 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61623770-f083-4741-a16c-427c7d637226-tmp-dir\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9"
Apr 23 08:13:22.043079 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:22.043058 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:13:22.043079 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.043068 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zjgg\" (UniqueName: \"kubernetes.io/projected/273cd29d-2e09-4e88-b51d-9b39805b5849-kube-api-access-8zjgg\") pod \"ingress-canary-4jflg\" (UID: \"273cd29d-2e09-4e88-b51d-9b39805b5849\") " pod="openshift-ingress-canary/ingress-canary-4jflg"
Apr 23 08:13:22.043417 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:22.043127 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert podName:273cd29d-2e09-4e88-b51d-9b39805b5849 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:22.543104701 +0000 UTC m=+33.597699947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert") pod "ingress-canary-4jflg" (UID: "273cd29d-2e09-4e88-b51d-9b39805b5849") : secret "canary-serving-cert" not found
Apr 23 08:13:22.043417 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:22.043144 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:13:22.043417 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:22.043178 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls podName:61623770-f083-4741-a16c-427c7d637226 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:22.543167534 +0000 UTC m=+33.597762790 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls") pod "dns-default-qlns9" (UID: "61623770-f083-4741-a16c-427c7d637226") : secret "dns-default-metrics-tls" not found
Apr 23 08:13:22.043538 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.043456 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/61623770-f083-4741-a16c-427c7d637226-tmp-dir\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9"
Apr 23 08:13:22.043627 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.043611 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/61623770-f083-4741-a16c-427c7d637226-config-volume\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9"
Apr 23 08:13:22.053797 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.053772 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjkr9\" (UniqueName: \"kubernetes.io/projected/61623770-f083-4741-a16c-427c7d637226-kube-api-access-bjkr9\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9"
Apr 23 08:13:22.063402 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.063383 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zjgg\" (UniqueName: \"kubernetes.io/projected/273cd29d-2e09-4e88-b51d-9b39805b5849-kube-api-access-8zjgg\") pod \"ingress-canary-4jflg\" (UID: \"273cd29d-2e09-4e88-b51d-9b39805b5849\") " pod="openshift-ingress-canary/ingress-canary-4jflg"
Apr 23 08:13:22.144025 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.143995 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs\") pod \"network-metrics-daemon-nkshk\" (UID: \"ad7c4f52-9a6e-427d-8a37-1c27216d412e\") " pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:13:22.144188 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:22.144172 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:13:22.144260 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:22.144237 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs podName:ad7c4f52-9a6e-427d-8a37-1c27216d412e nodeName:}" failed. No retries permitted until 2026-04-23 08:13:54.144220519 +0000 UTC m=+65.198815767 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs") pod "network-metrics-daemon-nkshk" (UID: "ad7c4f52-9a6e-427d-8a37-1c27216d412e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:13:22.345776 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.345686 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c88dv\" (UniqueName: \"kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv\") pod \"network-check-target-zxxfs\" (UID: \"d399127c-4640-46ee-b923-43ca2adc7c1e\") " pod="openshift-network-diagnostics/network-check-target-zxxfs"
Apr 23 08:13:22.345933 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:22.345844 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:13:22.345933 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:22.345867 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:13:22.345933 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:22.345880 2559 projected.go:194] Error preparing data for projected volume kube-api-access-c88dv for pod openshift-network-diagnostics/network-check-target-zxxfs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:13:22.346053 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:22.345943 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv podName:d399127c-4640-46ee-b923-43ca2adc7c1e nodeName:}" failed. No retries permitted until 2026-04-23 08:13:54.345923084 +0000 UTC m=+65.400518329 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-c88dv" (UniqueName: "kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv") pod "network-check-target-zxxfs" (UID: "d399127c-4640-46ee-b923-43ca2adc7c1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:13:22.522563 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.522528 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs"
Apr 23 08:13:22.522881 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.522856 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:13:22.525608 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.525528 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 08:13:22.525608 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.525545 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mdjsc\""
Apr 23 08:13:22.525608 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.525570 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 08:13:22.525867 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.525849 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 08:13:22.525867 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.525861 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f9mps\""
Apr 23 08:13:22.548115 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.548075 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9"
Apr 23 08:13:22.548214 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:22.548154 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert\") pod \"ingress-canary-4jflg\" (UID: \"273cd29d-2e09-4e88-b51d-9b39805b5849\") " pod="openshift-ingress-canary/ingress-canary-4jflg"
Apr 23 08:13:22.548214 ip-10-0-133-47 kubenswrapper[2559]: E0423
08:13:22.548176 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:13:22.548332 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:22.548232 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls podName:61623770-f083-4741-a16c-427c7d637226 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:23.54820999 +0000 UTC m=+34.602805242 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls") pod "dns-default-qlns9" (UID: "61623770-f083-4741-a16c-427c7d637226") : secret "dns-default-metrics-tls" not found Apr 23 08:13:22.548332 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:22.548277 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:13:22.548442 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:22.548336 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert podName:273cd29d-2e09-4e88-b51d-9b39805b5849 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:23.548319368 +0000 UTC m=+34.602914614 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert") pod "ingress-canary-4jflg" (UID: "273cd29d-2e09-4e88-b51d-9b39805b5849") : secret "canary-serving-cert" not found Apr 23 08:13:23.554507 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:23.554470 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9" Apr 23 08:13:23.555084 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:23.554535 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert\") pod \"ingress-canary-4jflg\" (UID: \"273cd29d-2e09-4e88-b51d-9b39805b5849\") " pod="openshift-ingress-canary/ingress-canary-4jflg" Apr 23 08:13:23.555084 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:23.554631 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:13:23.555084 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:23.554644 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:13:23.555084 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:23.554726 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert podName:273cd29d-2e09-4e88-b51d-9b39805b5849 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:25.554688148 +0000 UTC m=+36.609283394 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert") pod "ingress-canary-4jflg" (UID: "273cd29d-2e09-4e88-b51d-9b39805b5849") : secret "canary-serving-cert" not found Apr 23 08:13:23.555084 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:23.554748 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls podName:61623770-f083-4741-a16c-427c7d637226 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:25.554738936 +0000 UTC m=+36.609334184 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls") pod "dns-default-qlns9" (UID: "61623770-f083-4741-a16c-427c7d637226") : secret "dns-default-metrics-tls" not found Apr 23 08:13:24.783793 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:24.783607 2559 generic.go:358] "Generic (PLEG): container finished" podID="f22c4ac1-2d2c-4f45-9e34-42cd608dab18" containerID="d471a8343b7e6998272943e1fb2f3b80fbb751799449e05fd867338e40a37940" exitCode=0 Apr 23 08:13:24.784126 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:24.783690 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5lbkz" event={"ID":"f22c4ac1-2d2c-4f45-9e34-42cd608dab18","Type":"ContainerDied","Data":"d471a8343b7e6998272943e1fb2f3b80fbb751799449e05fd867338e40a37940"} Apr 23 08:13:25.567187 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:25.567156 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9" Apr 23 08:13:25.567347 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:25.567203 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert\") pod \"ingress-canary-4jflg\" (UID: \"273cd29d-2e09-4e88-b51d-9b39805b5849\") " pod="openshift-ingress-canary/ingress-canary-4jflg" Apr 23 08:13:25.567347 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:25.567290 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:13:25.567347 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:25.567335 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert podName:273cd29d-2e09-4e88-b51d-9b39805b5849 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:29.567323085 +0000 UTC m=+40.621918330 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert") pod "ingress-canary-4jflg" (UID: "273cd29d-2e09-4e88-b51d-9b39805b5849") : secret "canary-serving-cert" not found Apr 23 08:13:25.567455 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:25.567289 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:13:25.567455 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:25.567419 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls podName:61623770-f083-4741-a16c-427c7d637226 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:29.567406727 +0000 UTC m=+40.622001973 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls") pod "dns-default-qlns9" (UID: "61623770-f083-4741-a16c-427c7d637226") : secret "dns-default-metrics-tls" not found Apr 23 08:13:25.788018 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:25.787988 2559 generic.go:358] "Generic (PLEG): container finished" podID="f22c4ac1-2d2c-4f45-9e34-42cd608dab18" containerID="1b83066a4b35e571f26594ee0caf1a9ef27de3f0a80ded7c45f471a2dfca551b" exitCode=0 Apr 23 08:13:25.788381 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:25.788044 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5lbkz" event={"ID":"f22c4ac1-2d2c-4f45-9e34-42cd608dab18","Type":"ContainerDied","Data":"1b83066a4b35e571f26594ee0caf1a9ef27de3f0a80ded7c45f471a2dfca551b"} Apr 23 08:13:26.792655 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:26.792617 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5lbkz" event={"ID":"f22c4ac1-2d2c-4f45-9e34-42cd608dab18","Type":"ContainerStarted","Data":"0f434d5ce09a23f5cc2c09004f87dd16c786f2e668e8217bd881571fdff349b8"} Apr 23 08:13:26.815733 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:26.815280 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5lbkz" podStartSLOduration=4.659213295 podStartE2EDuration="37.815261791s" podCreationTimestamp="2026-04-23 08:12:49 +0000 UTC" firstStartedPulling="2026-04-23 08:12:50.820499115 +0000 UTC m=+1.875094361" lastFinishedPulling="2026-04-23 08:13:23.97654761 +0000 UTC m=+35.031142857" observedRunningTime="2026-04-23 08:13:26.813598356 +0000 UTC m=+37.868193624" watchObservedRunningTime="2026-04-23 08:13:26.815261791 +0000 UTC m=+37.869857059" Apr 23 08:13:27.183245 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.183172 2559 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt"] Apr 23 08:13:27.200474 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.200452 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f"] Apr 23 08:13:27.200620 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.200596 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt" Apr 23 08:13:27.203245 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.203221 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 08:13:27.203368 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.203231 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 08:13:27.203368 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.203287 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-bqcxp\"" Apr 23 08:13:27.203368 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.203311 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 23 08:13:27.203368 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.203347 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 08:13:27.219104 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.219084 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt"] Apr 23 08:13:27.219104 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.219103 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f"] Apr 23 08:13:27.219215 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.219183 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.222003 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.221984 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 23 08:13:27.222093 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.222067 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 23 08:13:27.222150 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.222070 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 23 08:13:27.222203 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.222170 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 23 08:13:27.378719 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.378678 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/94a512a5-4f04-4412-86e9-72cbf7402365-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt\" (UID: \"94a512a5-4f04-4412-86e9-72cbf7402365\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt" Apr 23 08:13:27.378819 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.378733 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/de41f31a-aeec-4549-a4a7-be7ba5b68e41-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.378863 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.378817 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxjn9\" (UniqueName: \"kubernetes.io/projected/de41f31a-aeec-4549-a4a7-be7ba5b68e41-kube-api-access-hxjn9\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.378863 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.378856 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfkxp\" (UniqueName: \"kubernetes.io/projected/94a512a5-4f04-4412-86e9-72cbf7402365-kube-api-access-tfkxp\") pod \"managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt\" (UID: \"94a512a5-4f04-4412-86e9-72cbf7402365\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt" Apr 23 08:13:27.378926 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.378874 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/de41f31a-aeec-4549-a4a7-be7ba5b68e41-ca\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.378962 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.378942 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/de41f31a-aeec-4549-a4a7-be7ba5b68e41-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.378995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.378978 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/de41f31a-aeec-4549-a4a7-be7ba5b68e41-hub\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.379027 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.378997 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/de41f31a-aeec-4549-a4a7-be7ba5b68e41-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.479238 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.479191 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/de41f31a-aeec-4549-a4a7-be7ba5b68e41-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.479238 
ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.479219 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/de41f31a-aeec-4549-a4a7-be7ba5b68e41-hub\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.479372 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.479239 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/de41f31a-aeec-4549-a4a7-be7ba5b68e41-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.479372 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.479259 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/94a512a5-4f04-4412-86e9-72cbf7402365-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt\" (UID: \"94a512a5-4f04-4412-86e9-72cbf7402365\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt" Apr 23 08:13:27.479473 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.479372 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/de41f31a-aeec-4549-a4a7-be7ba5b68e41-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.479473 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.479436 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hxjn9\" (UniqueName: \"kubernetes.io/projected/de41f31a-aeec-4549-a4a7-be7ba5b68e41-kube-api-access-hxjn9\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.479571 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.479490 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfkxp\" (UniqueName: \"kubernetes.io/projected/94a512a5-4f04-4412-86e9-72cbf7402365-kube-api-access-tfkxp\") pod \"managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt\" (UID: \"94a512a5-4f04-4412-86e9-72cbf7402365\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt" Apr 23 08:13:27.479571 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.479539 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/de41f31a-aeec-4549-a4a7-be7ba5b68e41-ca\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.479978 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.479955 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/de41f31a-aeec-4549-a4a7-be7ba5b68e41-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.483089 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.483059 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/de41f31a-aeec-4549-a4a7-be7ba5b68e41-ca\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: 
\"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.483169 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.483059 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/de41f31a-aeec-4549-a4a7-be7ba5b68e41-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.483169 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.483096 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/de41f31a-aeec-4549-a4a7-be7ba5b68e41-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.483335 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.483313 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/94a512a5-4f04-4412-86e9-72cbf7402365-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt\" (UID: \"94a512a5-4f04-4412-86e9-72cbf7402365\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt" Apr 23 08:13:27.483441 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.483425 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/de41f31a-aeec-4549-a4a7-be7ba5b68e41-hub\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.488645 ip-10-0-133-47 
kubenswrapper[2559]: I0423 08:13:27.488614 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxjn9\" (UniqueName: \"kubernetes.io/projected/de41f31a-aeec-4549-a4a7-be7ba5b68e41-kube-api-access-hxjn9\") pod \"cluster-proxy-proxy-agent-67dff55877-2xj5f\" (UID: \"de41f31a-aeec-4549-a4a7-be7ba5b68e41\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.489032 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.489013 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfkxp\" (UniqueName: \"kubernetes.io/projected/94a512a5-4f04-4412-86e9-72cbf7402365-kube-api-access-tfkxp\") pod \"managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt\" (UID: \"94a512a5-4f04-4412-86e9-72cbf7402365\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt" Apr 23 08:13:27.522926 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.522901 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt" Apr 23 08:13:27.529530 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.529513 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:13:27.719621 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.719595 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f"] Apr 23 08:13:27.722253 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:13:27.722222 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde41f31a_aeec_4549_a4a7_be7ba5b68e41.slice/crio-8c0c6c9eb197b04c9837423d41a52dff647978be8ea30af93fdb5d0f694d7d9f WatchSource:0}: Error finding container 8c0c6c9eb197b04c9837423d41a52dff647978be8ea30af93fdb5d0f694d7d9f: Status 404 returned error can't find the container with id 8c0c6c9eb197b04c9837423d41a52dff647978be8ea30af93fdb5d0f694d7d9f Apr 23 08:13:27.724374 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.724351 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt"] Apr 23 08:13:27.728186 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:13:27.728164 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94a512a5_4f04_4412_86e9_72cbf7402365.slice/crio-497dee758d3ffc078f0c5272bdb5dadd9192492178caddb795278cf3fe3b61be WatchSource:0}: Error finding container 497dee758d3ffc078f0c5272bdb5dadd9192492178caddb795278cf3fe3b61be: Status 404 returned error can't find the container with id 497dee758d3ffc078f0c5272bdb5dadd9192492178caddb795278cf3fe3b61be Apr 23 08:13:27.795074 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.795041 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" 
event={"ID":"de41f31a-aeec-4549-a4a7-be7ba5b68e41","Type":"ContainerStarted","Data":"8c0c6c9eb197b04c9837423d41a52dff647978be8ea30af93fdb5d0f694d7d9f"} Apr 23 08:13:27.797897 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:27.797870 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt" event={"ID":"94a512a5-4f04-4412-86e9-72cbf7402365","Type":"ContainerStarted","Data":"497dee758d3ffc078f0c5272bdb5dadd9192492178caddb795278cf3fe3b61be"} Apr 23 08:13:29.599062 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:29.599030 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9" Apr 23 08:13:29.599456 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:29.599100 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert\") pod \"ingress-canary-4jflg\" (UID: \"273cd29d-2e09-4e88-b51d-9b39805b5849\") " pod="openshift-ingress-canary/ingress-canary-4jflg" Apr 23 08:13:29.599456 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:29.599175 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:13:29.599456 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:29.599193 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:13:29.599456 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:29.599249 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls podName:61623770-f083-4741-a16c-427c7d637226 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:13:37.599227207 +0000 UTC m=+48.653822452 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls") pod "dns-default-qlns9" (UID: "61623770-f083-4741-a16c-427c7d637226") : secret "dns-default-metrics-tls" not found Apr 23 08:13:29.599456 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:29.599268 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert podName:273cd29d-2e09-4e88-b51d-9b39805b5849 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:37.599259237 +0000 UTC m=+48.653854483 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert") pod "ingress-canary-4jflg" (UID: "273cd29d-2e09-4e88-b51d-9b39805b5849") : secret "canary-serving-cert" not found Apr 23 08:13:32.809790 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:32.809746 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt" event={"ID":"94a512a5-4f04-4412-86e9-72cbf7402365","Type":"ContainerStarted","Data":"71d0d95b21dc4fe7c0b56173b6a91dceebb4c4310641d71e726933b0b136431c"} Apr 23 08:13:32.811223 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:32.811200 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" event={"ID":"de41f31a-aeec-4549-a4a7-be7ba5b68e41","Type":"ContainerStarted","Data":"a092c3b0e44337dd05b6447b209ca03c7390ada41b27cd16c1e31e37498242c3"} Apr 23 08:13:32.824385 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:32.824343 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c4cb6d96c-n7fxt" 
podStartSLOduration=1.728936662 podStartE2EDuration="5.82432935s" podCreationTimestamp="2026-04-23 08:13:27 +0000 UTC" firstStartedPulling="2026-04-23 08:13:27.729784262 +0000 UTC m=+38.784379507" lastFinishedPulling="2026-04-23 08:13:31.82517695 +0000 UTC m=+42.879772195" observedRunningTime="2026-04-23 08:13:32.82358072 +0000 UTC m=+43.878175987" watchObservedRunningTime="2026-04-23 08:13:32.82432935 +0000 UTC m=+43.878924616" Apr 23 08:13:34.816640 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:34.816592 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" event={"ID":"de41f31a-aeec-4549-a4a7-be7ba5b68e41","Type":"ContainerStarted","Data":"7b41a6f2ea87474f89adfeb4cb5013789eb7d84c13a07cb09a73d2d3aee5aab1"} Apr 23 08:13:34.816640 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:34.816642 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" event={"ID":"de41f31a-aeec-4549-a4a7-be7ba5b68e41","Type":"ContainerStarted","Data":"973c82b30dbefd096e5654e43e683bd10778fb3d1df92d252f04471ed0651373"} Apr 23 08:13:34.837602 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:34.837567 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" podStartSLOduration=1.148948705 podStartE2EDuration="7.837554967s" podCreationTimestamp="2026-04-23 08:13:27 +0000 UTC" firstStartedPulling="2026-04-23 08:13:27.724265469 +0000 UTC m=+38.778860716" lastFinishedPulling="2026-04-23 08:13:34.41287172 +0000 UTC m=+45.467466978" observedRunningTime="2026-04-23 08:13:34.836460807 +0000 UTC m=+45.891056210" watchObservedRunningTime="2026-04-23 08:13:34.837554967 +0000 UTC m=+45.892150224" Apr 23 08:13:37.657454 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:37.657415 2559 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9" Apr 23 08:13:37.657839 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:37.657484 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert\") pod \"ingress-canary-4jflg\" (UID: \"273cd29d-2e09-4e88-b51d-9b39805b5849\") " pod="openshift-ingress-canary/ingress-canary-4jflg" Apr 23 08:13:37.657839 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:37.657570 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:13:37.657839 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:37.657588 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:13:37.657839 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:37.657648 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert podName:273cd29d-2e09-4e88-b51d-9b39805b5849 nodeName:}" failed. No retries permitted until 2026-04-23 08:13:53.657629762 +0000 UTC m=+64.712225007 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert") pod "ingress-canary-4jflg" (UID: "273cd29d-2e09-4e88-b51d-9b39805b5849") : secret "canary-serving-cert" not found Apr 23 08:13:37.657839 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:37.657665 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls podName:61623770-f083-4741-a16c-427c7d637226 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:13:53.657657459 +0000 UTC m=+64.712252707 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls") pod "dns-default-qlns9" (UID: "61623770-f083-4741-a16c-427c7d637226") : secret "dns-default-metrics-tls" not found Apr 23 08:13:47.781880 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:47.781854 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqpq2" Apr 23 08:13:53.757270 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:53.757231 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9" Apr 23 08:13:53.757733 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:53.757295 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert\") pod \"ingress-canary-4jflg\" (UID: \"273cd29d-2e09-4e88-b51d-9b39805b5849\") " pod="openshift-ingress-canary/ingress-canary-4jflg" Apr 23 08:13:53.757733 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:53.757372 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:13:53.757733 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:53.757422 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:13:53.757733 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:53.757456 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls 
podName:61623770-f083-4741-a16c-427c7d637226 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:25.757440294 +0000 UTC m=+96.812035540 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls") pod "dns-default-qlns9" (UID: "61623770-f083-4741-a16c-427c7d637226") : secret "dns-default-metrics-tls" not found Apr 23 08:13:53.757733 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:53.757471 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert podName:273cd29d-2e09-4e88-b51d-9b39805b5849 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:25.757464253 +0000 UTC m=+96.812059497 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert") pod "ingress-canary-4jflg" (UID: "273cd29d-2e09-4e88-b51d-9b39805b5849") : secret "canary-serving-cert" not found Apr 23 08:13:54.160023 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:54.159993 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs\") pod \"network-metrics-daemon-nkshk\" (UID: \"ad7c4f52-9a6e-427d-8a37-1c27216d412e\") " pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:13:54.162831 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:54.162811 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:13:54.170910 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:54.170895 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:13:54.170979 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:13:54.170943 2559 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs podName:ad7c4f52-9a6e-427d-8a37-1c27216d412e nodeName:}" failed. No retries permitted until 2026-04-23 08:14:58.170929228 +0000 UTC m=+129.225524472 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs") pod "network-metrics-daemon-nkshk" (UID: "ad7c4f52-9a6e-427d-8a37-1c27216d412e") : secret "metrics-daemon-secret" not found Apr 23 08:13:54.361674 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:54.361645 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c88dv\" (UniqueName: \"kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv\") pod \"network-check-target-zxxfs\" (UID: \"d399127c-4640-46ee-b923-43ca2adc7c1e\") " pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:13:54.364506 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:54.364489 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 08:13:54.374624 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:54.374602 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 08:13:54.385710 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:54.385687 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c88dv\" (UniqueName: \"kubernetes.io/projected/d399127c-4640-46ee-b923-43ca2adc7c1e-kube-api-access-c88dv\") pod \"network-check-target-zxxfs\" (UID: \"d399127c-4640-46ee-b923-43ca2adc7c1e\") " pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:13:54.637711 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:54.637688 2559 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mdjsc\"" Apr 23 08:13:54.645638 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:54.645619 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:13:54.774037 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:54.774008 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zxxfs"] Apr 23 08:13:54.776591 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:13:54.776561 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd399127c_4640_46ee_b923_43ca2adc7c1e.slice/crio-d4aabfb63b9475c16f8b75025460b92f66daf03f175e0c254aea13544b61b50b WatchSource:0}: Error finding container d4aabfb63b9475c16f8b75025460b92f66daf03f175e0c254aea13544b61b50b: Status 404 returned error can't find the container with id d4aabfb63b9475c16f8b75025460b92f66daf03f175e0c254aea13544b61b50b Apr 23 08:13:54.854483 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:54.854454 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zxxfs" event={"ID":"d399127c-4640-46ee-b923-43ca2adc7c1e","Type":"ContainerStarted","Data":"d4aabfb63b9475c16f8b75025460b92f66daf03f175e0c254aea13544b61b50b"} Apr 23 08:13:57.863074 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:57.863043 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zxxfs" event={"ID":"d399127c-4640-46ee-b923-43ca2adc7c1e","Type":"ContainerStarted","Data":"0ba8a0032edfa58f2211597d8cadd9794ebc6aef7b05215fd48fcf8e4111a5e6"} Apr 23 08:13:57.863407 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:57.863192 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 
08:13:57.882802 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:13:57.882758 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zxxfs" podStartSLOduration=66.179335235 podStartE2EDuration="1m8.88274394s" podCreationTimestamp="2026-04-23 08:12:49 +0000 UTC" firstStartedPulling="2026-04-23 08:13:54.778551234 +0000 UTC m=+65.833146479" lastFinishedPulling="2026-04-23 08:13:57.481959936 +0000 UTC m=+68.536555184" observedRunningTime="2026-04-23 08:13:57.882180775 +0000 UTC m=+68.936776042" watchObservedRunningTime="2026-04-23 08:13:57.88274394 +0000 UTC m=+68.937339234" Apr 23 08:14:05.009672 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:05.009641 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-58xlf_7a708d86-77eb-45d6-9cb2-d56380306e76/dns-node-resolver/0.log" Apr 23 08:14:06.209357 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:06.209330 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lwlgs_49916299-de53-445c-b3f3-1fc134a6e4cc/node-ca/0.log" Apr 23 08:14:25.773949 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:25.773802 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert\") pod \"ingress-canary-4jflg\" (UID: \"273cd29d-2e09-4e88-b51d-9b39805b5849\") " pod="openshift-ingress-canary/ingress-canary-4jflg" Apr 23 08:14:25.773949 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:25.773878 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9" Apr 23 08:14:25.776231 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:25.776207 2559 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61623770-f083-4741-a16c-427c7d637226-metrics-tls\") pod \"dns-default-qlns9\" (UID: \"61623770-f083-4741-a16c-427c7d637226\") " pod="openshift-dns/dns-default-qlns9" Apr 23 08:14:25.776368 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:25.776348 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/273cd29d-2e09-4e88-b51d-9b39805b5849-cert\") pod \"ingress-canary-4jflg\" (UID: \"273cd29d-2e09-4e88-b51d-9b39805b5849\") " pod="openshift-ingress-canary/ingress-canary-4jflg" Apr 23 08:14:26.043523 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:26.043504 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dcms6\"" Apr 23 08:14:26.051688 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:26.051663 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qlns9" Apr 23 08:14:26.059634 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:26.059616 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sb9bx\"" Apr 23 08:14:26.067693 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:26.067676 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4jflg" Apr 23 08:14:26.174830 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:26.174800 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qlns9"] Apr 23 08:14:26.178600 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:14:26.178576 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61623770_f083_4741_a16c_427c7d637226.slice/crio-fdbcba49281f43891db03702247901edc7458de7455e23ab218e17141188f9e0 WatchSource:0}: Error finding container fdbcba49281f43891db03702247901edc7458de7455e23ab218e17141188f9e0: Status 404 returned error can't find the container with id fdbcba49281f43891db03702247901edc7458de7455e23ab218e17141188f9e0 Apr 23 08:14:26.193023 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:26.193001 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4jflg"] Apr 23 08:14:26.195381 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:14:26.195358 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod273cd29d_2e09_4e88_b51d_9b39805b5849.slice/crio-bcd449b76cf00d935a6fc9e4cf54b15f488dccab12d3d9903664f09527eb617e WatchSource:0}: Error finding container bcd449b76cf00d935a6fc9e4cf54b15f488dccab12d3d9903664f09527eb617e: Status 404 returned error can't find the container with id bcd449b76cf00d935a6fc9e4cf54b15f488dccab12d3d9903664f09527eb617e Apr 23 08:14:26.936795 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:26.936736 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4jflg" event={"ID":"273cd29d-2e09-4e88-b51d-9b39805b5849","Type":"ContainerStarted","Data":"bcd449b76cf00d935a6fc9e4cf54b15f488dccab12d3d9903664f09527eb617e"} Apr 23 08:14:26.938033 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:26.937999 2559 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qlns9" event={"ID":"61623770-f083-4741-a16c-427c7d637226","Type":"ContainerStarted","Data":"fdbcba49281f43891db03702247901edc7458de7455e23ab218e17141188f9e0"} Apr 23 08:14:27.942852 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:27.942801 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qlns9" event={"ID":"61623770-f083-4741-a16c-427c7d637226","Type":"ContainerStarted","Data":"d8458536e75392346d2b077133570cbf79999d0a36fa09781325d67dab3996a2"} Apr 23 08:14:27.942852 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:27.942848 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qlns9" event={"ID":"61623770-f083-4741-a16c-427c7d637226","Type":"ContainerStarted","Data":"31142fd02e9e7a9fd1235fe9aa4c58755db9b59d82364bb5158a37eec206ec80"} Apr 23 08:14:27.943381 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:27.942995 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qlns9" Apr 23 08:14:27.960153 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:27.960112 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qlns9" podStartSLOduration=65.661510297 podStartE2EDuration="1m6.960100853s" podCreationTimestamp="2026-04-23 08:13:21 +0000 UTC" firstStartedPulling="2026-04-23 08:14:26.180333894 +0000 UTC m=+97.234929142" lastFinishedPulling="2026-04-23 08:14:27.478924449 +0000 UTC m=+98.533519698" observedRunningTime="2026-04-23 08:14:27.959480211 +0000 UTC m=+99.014075477" watchObservedRunningTime="2026-04-23 08:14:27.960100853 +0000 UTC m=+99.014696121" Apr 23 08:14:28.868898 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:28.868871 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zxxfs" Apr 23 08:14:28.946872 ip-10-0-133-47 kubenswrapper[2559]: I0423 
08:14:28.946841 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4jflg" event={"ID":"273cd29d-2e09-4e88-b51d-9b39805b5849","Type":"ContainerStarted","Data":"74b3a11e655a3be5773ec911224f6f0b0d0057255338cbfa47ef396166323c9e"} Apr 23 08:14:28.961200 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:28.961158 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4jflg" podStartSLOduration=66.004761062 podStartE2EDuration="1m7.961144971s" podCreationTimestamp="2026-04-23 08:13:21 +0000 UTC" firstStartedPulling="2026-04-23 08:14:26.19701691 +0000 UTC m=+97.251612155" lastFinishedPulling="2026-04-23 08:14:28.153400816 +0000 UTC m=+99.207996064" observedRunningTime="2026-04-23 08:14:28.960282409 +0000 UTC m=+100.014877689" watchObservedRunningTime="2026-04-23 08:14:28.961144971 +0000 UTC m=+100.015740237" Apr 23 08:14:31.939693 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:31.939661 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xmsss"] Apr 23 08:14:31.942693 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:31.942672 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:31.945132 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:31.945108 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 08:14:31.946134 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:31.946109 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 08:14:31.946233 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:31.946109 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vtw6x\"" Apr 23 08:14:31.946233 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:31.946184 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 08:14:31.946233 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:31.946197 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 08:14:31.960014 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:31.959987 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xmsss"] Apr 23 08:14:32.020105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.020081 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.020209 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.020111 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-crio-socket\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.020209 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.020129 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.020209 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.020151 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-data-volume\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.020314 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.020249 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z84cg\" (UniqueName: \"kubernetes.io/projected/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-kube-api-access-z84cg\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.121190 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.121168 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-data-volume\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " 
pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.121274 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.121198 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z84cg\" (UniqueName: \"kubernetes.io/projected/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-kube-api-access-z84cg\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.121347 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.121328 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.121390 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.121366 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-crio-socket\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.121448 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.121391 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.121566 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.121462 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-data-volume\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.121566 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.121463 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-crio-socket\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.121822 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.121806 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.123692 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.123672 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.133208 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.133184 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z84cg\" (UniqueName: \"kubernetes.io/projected/ab7bfdf3-4a34-45ea-8800-45dc49f9cca7-kube-api-access-z84cg\") pod \"insights-runtime-extractor-xmsss\" (UID: \"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7\") " pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.252044 ip-10-0-133-47 kubenswrapper[2559]: 
I0423 08:14:32.251993 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xmsss" Apr 23 08:14:32.363672 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.363645 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xmsss"] Apr 23 08:14:32.366967 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:14:32.366941 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab7bfdf3_4a34_45ea_8800_45dc49f9cca7.slice/crio-cf42e6bf01c95d7318971265e4a316564529531db893f54bde57b06743e3fd8a WatchSource:0}: Error finding container cf42e6bf01c95d7318971265e4a316564529531db893f54bde57b06743e3fd8a: Status 404 returned error can't find the container with id cf42e6bf01c95d7318971265e4a316564529531db893f54bde57b06743e3fd8a Apr 23 08:14:32.959922 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.959891 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xmsss" event={"ID":"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7","Type":"ContainerStarted","Data":"c112e042d911226ed94d411467f446470e60f5f9cc3a0a176121ee766af801d8"} Apr 23 08:14:32.959922 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:32.959924 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xmsss" event={"ID":"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7","Type":"ContainerStarted","Data":"cf42e6bf01c95d7318971265e4a316564529531db893f54bde57b06743e3fd8a"} Apr 23 08:14:33.964351 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:33.964309 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xmsss" event={"ID":"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7","Type":"ContainerStarted","Data":"285a48701cf0d3ee74504dc5ea8ce3d404b42f559cf471feded261a12923908a"} Apr 23 08:14:34.968285 ip-10-0-133-47 
kubenswrapper[2559]: I0423 08:14:34.968252 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xmsss" event={"ID":"ab7bfdf3-4a34-45ea-8800-45dc49f9cca7","Type":"ContainerStarted","Data":"41fc1aca768f8c0a6d35f594caba3c156a7c822eb2a7b8e94a69246d8a03fdbb"} Apr 23 08:14:34.985867 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:34.985825 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xmsss" podStartSLOduration=2.086743261 podStartE2EDuration="3.985812443s" podCreationTimestamp="2026-04-23 08:14:31 +0000 UTC" firstStartedPulling="2026-04-23 08:14:32.424759324 +0000 UTC m=+103.479354572" lastFinishedPulling="2026-04-23 08:14:34.323828505 +0000 UTC m=+105.378423754" observedRunningTime="2026-04-23 08:14:34.985232158 +0000 UTC m=+106.039827425" watchObservedRunningTime="2026-04-23 08:14:34.985812443 +0000 UTC m=+106.040407709" Apr 23 08:14:37.306938 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.306898 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"] Apr 23 08:14:37.308909 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.308889 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:37.311331 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.311303 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 23 08:14:37.311481 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.311369 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 23 08:14:37.312310 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.312294 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 08:14:37.312398 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.312372 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 08:14:37.312447 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.312429 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 08:14:37.312539 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.312522 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-7wqhd\""
Apr 23 08:14:37.317346 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.317322 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"]
Apr 23 08:14:37.353911 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.353880 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p57mm\" (UniqueName: \"kubernetes.io/projected/f1caca1b-e23c-4bb4-8fd1-3980b966f382-kube-api-access-p57mm\") pod \"prometheus-operator-5676c8c784-qhmgn\" (UID: \"f1caca1b-e23c-4bb4-8fd1-3980b966f382\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:37.354020 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.353948 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1caca1b-e23c-4bb4-8fd1-3980b966f382-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-qhmgn\" (UID: \"f1caca1b-e23c-4bb4-8fd1-3980b966f382\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:37.354020 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.354003 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1caca1b-e23c-4bb4-8fd1-3980b966f382-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-qhmgn\" (UID: \"f1caca1b-e23c-4bb4-8fd1-3980b966f382\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:37.354098 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.354040 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1caca1b-e23c-4bb4-8fd1-3980b966f382-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-qhmgn\" (UID: \"f1caca1b-e23c-4bb4-8fd1-3980b966f382\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:37.454264 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.454233 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1caca1b-e23c-4bb4-8fd1-3980b966f382-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-qhmgn\" (UID: \"f1caca1b-e23c-4bb4-8fd1-3980b966f382\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:37.454373 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.454274 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1caca1b-e23c-4bb4-8fd1-3980b966f382-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-qhmgn\" (UID: \"f1caca1b-e23c-4bb4-8fd1-3980b966f382\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:37.454373 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.454316 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1caca1b-e23c-4bb4-8fd1-3980b966f382-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-qhmgn\" (UID: \"f1caca1b-e23c-4bb4-8fd1-3980b966f382\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:37.454373 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.454350 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p57mm\" (UniqueName: \"kubernetes.io/projected/f1caca1b-e23c-4bb4-8fd1-3980b966f382-kube-api-access-p57mm\") pod \"prometheus-operator-5676c8c784-qhmgn\" (UID: \"f1caca1b-e23c-4bb4-8fd1-3980b966f382\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:37.454534 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:14:37.454369 2559 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 23 08:14:37.454534 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:14:37.454432 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1caca1b-e23c-4bb4-8fd1-3980b966f382-prometheus-operator-tls podName:f1caca1b-e23c-4bb4-8fd1-3980b966f382 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:37.954417053 +0000 UTC m=+109.009012305 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/f1caca1b-e23c-4bb4-8fd1-3980b966f382-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-qhmgn" (UID: "f1caca1b-e23c-4bb4-8fd1-3980b966f382") : secret "prometheus-operator-tls" not found
Apr 23 08:14:37.454988 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.454968 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1caca1b-e23c-4bb4-8fd1-3980b966f382-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-qhmgn\" (UID: \"f1caca1b-e23c-4bb4-8fd1-3980b966f382\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:37.456772 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.456752 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1caca1b-e23c-4bb4-8fd1-3980b966f382-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-qhmgn\" (UID: \"f1caca1b-e23c-4bb4-8fd1-3980b966f382\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:37.463888 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.463862 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p57mm\" (UniqueName: \"kubernetes.io/projected/f1caca1b-e23c-4bb4-8fd1-3980b966f382-kube-api-access-p57mm\") pod \"prometheus-operator-5676c8c784-qhmgn\" (UID: \"f1caca1b-e23c-4bb4-8fd1-3980b966f382\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:37.948747 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.948692 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qlns9"
Apr 23 08:14:37.958295 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.958271 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1caca1b-e23c-4bb4-8fd1-3980b966f382-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-qhmgn\" (UID: \"f1caca1b-e23c-4bb4-8fd1-3980b966f382\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:37.960529 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:37.960508 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1caca1b-e23c-4bb4-8fd1-3980b966f382-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-qhmgn\" (UID: \"f1caca1b-e23c-4bb4-8fd1-3980b966f382\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:38.219065 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:38.219000 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"
Apr 23 08:14:38.336428 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:38.336406 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-qhmgn"]
Apr 23 08:14:38.338523 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:14:38.338498 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1caca1b_e23c_4bb4_8fd1_3980b966f382.slice/crio-ce456f4001122243a2f672331dbfd3c2ecdaf96840ca8693f5ba3a9ad9ebe134 WatchSource:0}: Error finding container ce456f4001122243a2f672331dbfd3c2ecdaf96840ca8693f5ba3a9ad9ebe134: Status 404 returned error can't find the container with id ce456f4001122243a2f672331dbfd3c2ecdaf96840ca8693f5ba3a9ad9ebe134
Apr 23 08:14:38.978527 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:38.978492 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn" event={"ID":"f1caca1b-e23c-4bb4-8fd1-3980b966f382","Type":"ContainerStarted","Data":"ce456f4001122243a2f672331dbfd3c2ecdaf96840ca8693f5ba3a9ad9ebe134"}
Apr 23 08:14:39.982336 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:39.982304 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn" event={"ID":"f1caca1b-e23c-4bb4-8fd1-3980b966f382","Type":"ContainerStarted","Data":"0c7950ecec455efb98e49f20b26026767df40e6187fcc98069e9f8a1a12efe18"}
Apr 23 08:14:39.982336 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:39.982337 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn" event={"ID":"f1caca1b-e23c-4bb4-8fd1-3980b966f382","Type":"ContainerStarted","Data":"5e2ac8c6fff6edc83e7ef1e562e1ca00a99fbb90e8cb415e5bddfa103d72785a"}
Apr 23 08:14:39.998117 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:39.998073 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-qhmgn" podStartSLOduration=1.8559580150000001 podStartE2EDuration="2.998061113s" podCreationTimestamp="2026-04-23 08:14:37 +0000 UTC" firstStartedPulling="2026-04-23 08:14:38.340238253 +0000 UTC m=+109.394833498" lastFinishedPulling="2026-04-23 08:14:39.482341347 +0000 UTC m=+110.536936596" observedRunningTime="2026-04-23 08:14:39.997644262 +0000 UTC m=+111.052239534" watchObservedRunningTime="2026-04-23 08:14:39.998061113 +0000 UTC m=+111.052656417"
Apr 23 08:14:41.639777 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.639749 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wkz4q"]
Apr 23 08:14:41.641656 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.641641 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.644090 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.644068 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x74z6\""
Apr 23 08:14:41.644301 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.644241 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 08:14:41.644408 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.644347 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 08:14:41.644928 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.644858 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 08:14:41.784355 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.784333 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxqsp\" (UniqueName: \"kubernetes.io/projected/8acc3a89-0290-48b6-8242-999d9b623b35-kube-api-access-nxqsp\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.784469 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.784365 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-textfile\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.784469 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.784387 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.784469 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.784461 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8acc3a89-0290-48b6-8242-999d9b623b35-sys\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.784594 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.784506 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-wtmp\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.784594 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.784561 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8acc3a89-0290-48b6-8242-999d9b623b35-root\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.784594 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.784587 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-accelerators-collector-config\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.784731 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.784606 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-tls\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.784731 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.784631 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8acc3a89-0290-48b6-8242-999d9b623b35-metrics-client-ca\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.884919 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.884890 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8acc3a89-0290-48b6-8242-999d9b623b35-sys\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.884919 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.884921 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-wtmp\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.885089 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.884947 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8acc3a89-0290-48b6-8242-999d9b623b35-root\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.885089 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.884963 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-accelerators-collector-config\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.885089 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.885010 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8acc3a89-0290-48b6-8242-999d9b623b35-sys\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.885089 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.885019 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8acc3a89-0290-48b6-8242-999d9b623b35-root\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.885089 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.885046 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-tls\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.885089 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.885086 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8acc3a89-0290-48b6-8242-999d9b623b35-metrics-client-ca\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.885368 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.885112 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxqsp\" (UniqueName: \"kubernetes.io/projected/8acc3a89-0290-48b6-8242-999d9b623b35-kube-api-access-nxqsp\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.885368 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.885108 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-wtmp\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.885368 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.885222 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-textfile\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.885368 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.885261 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.885585 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.885565 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8acc3a89-0290-48b6-8242-999d9b623b35-metrics-client-ca\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.885635 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.885563 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-accelerators-collector-config\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.885635 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.885571 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-textfile\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.887381 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.887360 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.887489 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.887470 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8acc3a89-0290-48b6-8242-999d9b623b35-node-exporter-tls\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.894690 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.894642 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxqsp\" (UniqueName: \"kubernetes.io/projected/8acc3a89-0290-48b6-8242-999d9b623b35-kube-api-access-nxqsp\") pod \"node-exporter-wkz4q\" (UID: \"8acc3a89-0290-48b6-8242-999d9b623b35\") " pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.950695 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.950675 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wkz4q"
Apr 23 08:14:41.958077 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:14:41.958055 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8acc3a89_0290_48b6_8242_999d9b623b35.slice/crio-1b0b55282b6cd6b69846b18aa8315bf8693acd88581daff9e617edc1e33871bd WatchSource:0}: Error finding container 1b0b55282b6cd6b69846b18aa8315bf8693acd88581daff9e617edc1e33871bd: Status 404 returned error can't find the container with id 1b0b55282b6cd6b69846b18aa8315bf8693acd88581daff9e617edc1e33871bd
Apr 23 08:14:41.987758 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:41.987737 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wkz4q" event={"ID":"8acc3a89-0290-48b6-8242-999d9b623b35","Type":"ContainerStarted","Data":"1b0b55282b6cd6b69846b18aa8315bf8693acd88581daff9e617edc1e33871bd"}
Apr 23 08:14:42.991192 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:42.991112 2559 generic.go:358] "Generic (PLEG): container finished" podID="8acc3a89-0290-48b6-8242-999d9b623b35" containerID="13de2e1a4b41958021e62d7fc32e0e9700885bf4136163beb59b8b01b5760771" exitCode=0
Apr 23 08:14:42.991532 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:42.991189 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wkz4q" event={"ID":"8acc3a89-0290-48b6-8242-999d9b623b35","Type":"ContainerDied","Data":"13de2e1a4b41958021e62d7fc32e0e9700885bf4136163beb59b8b01b5760771"}
Apr 23 08:14:43.995389 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:43.995354 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wkz4q" event={"ID":"8acc3a89-0290-48b6-8242-999d9b623b35","Type":"ContainerStarted","Data":"6c34898e43743370119f6287cd7ad4d590bcb56ecc5824cf93e6a0677ddd337d"}
Apr 23 08:14:43.995389 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:43.995388 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wkz4q" event={"ID":"8acc3a89-0290-48b6-8242-999d9b623b35","Type":"ContainerStarted","Data":"812855722ba032a41cc29a40f7d178b150546b0bb7afd8297c06ebf35bb2c2f4"}
Apr 23 08:14:44.013656 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:44.013611 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wkz4q" podStartSLOduration=2.373841922 podStartE2EDuration="3.013598489s" podCreationTimestamp="2026-04-23 08:14:41 +0000 UTC" firstStartedPulling="2026-04-23 08:14:41.959893222 +0000 UTC m=+113.014488482" lastFinishedPulling="2026-04-23 08:14:42.599649796 +0000 UTC m=+113.654245049" observedRunningTime="2026-04-23 08:14:44.01353966 +0000 UTC m=+115.068134938" watchObservedRunningTime="2026-04-23 08:14:44.013598489 +0000 UTC m=+115.068193771"
Apr 23 08:14:55.059026 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.058994 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cf99d454d-ljf9n"]
Apr 23 08:14:55.061105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.061089 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.063641 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.063615 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 23 08:14:55.064842 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.064646 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 23 08:14:55.064842 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.064669 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 23 08:14:55.064842 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.064682 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 23 08:14:55.064842 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.064714 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 23 08:14:55.064842 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.064738 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-s87b2\""
Apr 23 08:14:55.064842 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.064750 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 23 08:14:55.065154 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.065141 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 23 08:14:55.071556 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.071537 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cf99d454d-ljf9n"]
Apr 23 08:14:55.176872 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.176845 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c86e2627-5829-4910-9869-1f89c4aadaa0-console-serving-cert\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.176976 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.176876 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-service-ca\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.176976 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.176917 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c86e2627-5829-4910-9869-1f89c4aadaa0-console-oauth-config\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.176976 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.176940 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-oauth-serving-cert\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.176976 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.176962 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-console-config\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.177110 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.176979 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpz44\" (UniqueName: \"kubernetes.io/projected/c86e2627-5829-4910-9869-1f89c4aadaa0-kube-api-access-cpz44\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.278218 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.278188 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c86e2627-5829-4910-9869-1f89c4aadaa0-console-serving-cert\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.278218 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.278224 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-service-ca\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.278331 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.278264 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c86e2627-5829-4910-9869-1f89c4aadaa0-console-oauth-config\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.278331 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.278289 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-oauth-serving-cert\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.278331 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.278313 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-console-config\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.278463 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.278335 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpz44\" (UniqueName: \"kubernetes.io/projected/c86e2627-5829-4910-9869-1f89c4aadaa0-kube-api-access-cpz44\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.279037 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.279018 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-console-config\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.279146 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.279059 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-service-ca\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.279146 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.279061 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-oauth-serving-cert\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.280677 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.280657 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c86e2627-5829-4910-9869-1f89c4aadaa0-console-serving-cert\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.280783 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.280763 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c86e2627-5829-4910-9869-1f89c4aadaa0-console-oauth-config\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.287324 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.287299 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpz44\" (UniqueName: \"kubernetes.io/projected/c86e2627-5829-4910-9869-1f89c4aadaa0-kube-api-access-cpz44\") pod \"console-7cf99d454d-ljf9n\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.370726 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.370649 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cf99d454d-ljf9n"
Apr 23 08:14:55.485574 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:55.485546 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cf99d454d-ljf9n"]
Apr 23 08:14:55.489161 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:14:55.489138 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86e2627_5829_4910_9869_1f89c4aadaa0.slice/crio-75b2f07b14b1cda4364f79947b61e5319d8fbedca5d653be4a7ae28549b6005b WatchSource:0}: Error finding container 75b2f07b14b1cda4364f79947b61e5319d8fbedca5d653be4a7ae28549b6005b: Status 404 returned error can't find the container with id 75b2f07b14b1cda4364f79947b61e5319d8fbedca5d653be4a7ae28549b6005b
Apr 23 08:14:56.029231 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:56.029201 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf99d454d-ljf9n" event={"ID":"c86e2627-5829-4910-9869-1f89c4aadaa0","Type":"ContainerStarted","Data":"75b2f07b14b1cda4364f79947b61e5319d8fbedca5d653be4a7ae28549b6005b"}
Apr 23 08:14:58.202205 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:58.202155 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs\") pod \"network-metrics-daemon-nkshk\" (UID: \"ad7c4f52-9a6e-427d-8a37-1c27216d412e\") " pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:14:58.204835 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:58.204809 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad7c4f52-9a6e-427d-8a37-1c27216d412e-metrics-certs\") pod \"network-metrics-daemon-nkshk\" (UID: \"ad7c4f52-9a6e-427d-8a37-1c27216d412e\") " pod="openshift-multus/network-metrics-daemon-nkshk"
Apr 23 08:14:58.241671
ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:58.241639 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f9mps\"" Apr 23 08:14:58.249753 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:58.249733 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkshk" Apr 23 08:14:58.763230 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:58.763205 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nkshk"] Apr 23 08:14:58.765834 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:14:58.765806 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad7c4f52_9a6e_427d_8a37_1c27216d412e.slice/crio-d06c36b229403bd4cb865e63e598682d6514bac44d67373bb673ae371477062c WatchSource:0}: Error finding container d06c36b229403bd4cb865e63e598682d6514bac44d67373bb673ae371477062c: Status 404 returned error can't find the container with id d06c36b229403bd4cb865e63e598682d6514bac44d67373bb673ae371477062c Apr 23 08:14:59.037484 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:59.037404 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nkshk" event={"ID":"ad7c4f52-9a6e-427d-8a37-1c27216d412e","Type":"ContainerStarted","Data":"d06c36b229403bd4cb865e63e598682d6514bac44d67373bb673ae371477062c"} Apr 23 08:14:59.038660 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:59.038631 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf99d454d-ljf9n" event={"ID":"c86e2627-5829-4910-9869-1f89c4aadaa0","Type":"ContainerStarted","Data":"d5c9ae0cd8f8e6fc5aaeaf53ccf756ac9b3237fdaebfbcb3abfdb59e2bf1dcfa"} Apr 23 08:14:59.055146 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:14:59.055104 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-7cf99d454d-ljf9n" podStartSLOduration=0.874194448 podStartE2EDuration="4.055091183s" podCreationTimestamp="2026-04-23 08:14:55 +0000 UTC" firstStartedPulling="2026-04-23 08:14:55.490838974 +0000 UTC m=+126.545434219" lastFinishedPulling="2026-04-23 08:14:58.671735696 +0000 UTC m=+129.726330954" observedRunningTime="2026-04-23 08:14:59.055015179 +0000 UTC m=+130.109610443" watchObservedRunningTime="2026-04-23 08:14:59.055091183 +0000 UTC m=+130.109686448" Apr 23 08:15:00.042562 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:00.042525 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nkshk" event={"ID":"ad7c4f52-9a6e-427d-8a37-1c27216d412e","Type":"ContainerStarted","Data":"41a69b13af8aba0075026bf6aba72cd1736056daad0805ee6b99a48e55661a47"} Apr 23 08:15:00.042932 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:00.042570 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nkshk" event={"ID":"ad7c4f52-9a6e-427d-8a37-1c27216d412e","Type":"ContainerStarted","Data":"94eb0d222d43409097b27b0bec59b069fb30d4a95b0464e05a1cb502d29305e5"} Apr 23 08:15:00.057242 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:00.057199 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nkshk" podStartSLOduration=130.117230736 podStartE2EDuration="2m11.057184119s" podCreationTimestamp="2026-04-23 08:12:49 +0000 UTC" firstStartedPulling="2026-04-23 08:14:58.767801341 +0000 UTC m=+129.822396590" lastFinishedPulling="2026-04-23 08:14:59.707754728 +0000 UTC m=+130.762349973" observedRunningTime="2026-04-23 08:15:00.056740979 +0000 UTC m=+131.111336243" watchObservedRunningTime="2026-04-23 08:15:00.057184119 +0000 UTC m=+131.111779388" Apr 23 08:15:05.371026 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:05.370980 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-7cf99d454d-ljf9n" Apr 23 08:15:05.371026 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:05.371034 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cf99d454d-ljf9n" Apr 23 08:15:05.375743 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:05.375722 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cf99d454d-ljf9n" Apr 23 08:15:06.062862 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:06.062831 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cf99d454d-ljf9n" Apr 23 08:15:07.531025 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:07.530970 2559 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" podUID="de41f31a-aeec-4549-a4a7-be7ba5b68e41" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 08:15:15.130114 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:15.130084 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cf99d454d-ljf9n"] Apr 23 08:15:17.531286 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:17.531241 2559 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" podUID="de41f31a-aeec-4549-a4a7-be7ba5b68e41" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 08:15:23.597242 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:23.597212 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qlns9_61623770-f083-4741-a16c-427c7d637226/dns/0.log" Apr 23 08:15:23.602544 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:23.602520 2559 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-qlns9_61623770-f083-4741-a16c-427c7d637226/kube-rbac-proxy/0.log" Apr 23 08:15:23.689946 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:23.689925 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-58xlf_7a708d86-77eb-45d6-9cb2-d56380306e76/dns-node-resolver/0.log" Apr 23 08:15:27.530863 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:27.530827 2559 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" podUID="de41f31a-aeec-4549-a4a7-be7ba5b68e41" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 08:15:27.531228 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:27.530897 2559 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" Apr 23 08:15:27.531337 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:27.531309 2559 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"7b41a6f2ea87474f89adfeb4cb5013789eb7d84c13a07cb09a73d2d3aee5aab1"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 23 08:15:27.531373 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:27.531357 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" podUID="de41f31a-aeec-4549-a4a7-be7ba5b68e41" containerName="service-proxy" containerID="cri-o://7b41a6f2ea87474f89adfeb4cb5013789eb7d84c13a07cb09a73d2d3aee5aab1" gracePeriod=30 Apr 23 08:15:28.118130 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:28.118096 2559 generic.go:358] "Generic (PLEG): container finished" 
podID="de41f31a-aeec-4549-a4a7-be7ba5b68e41" containerID="7b41a6f2ea87474f89adfeb4cb5013789eb7d84c13a07cb09a73d2d3aee5aab1" exitCode=2 Apr 23 08:15:28.118287 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:28.118161 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" event={"ID":"de41f31a-aeec-4549-a4a7-be7ba5b68e41","Type":"ContainerDied","Data":"7b41a6f2ea87474f89adfeb4cb5013789eb7d84c13a07cb09a73d2d3aee5aab1"} Apr 23 08:15:28.118287 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:28.118200 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dff55877-2xj5f" event={"ID":"de41f31a-aeec-4549-a4a7-be7ba5b68e41","Type":"ContainerStarted","Data":"7ce794eb6421cb7ced26a5a36e9ef8f48d6be82464fc9db56a229bfec43071ec"} Apr 23 08:15:40.148860 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.148796 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7cf99d454d-ljf9n" podUID="c86e2627-5829-4910-9869-1f89c4aadaa0" containerName="console" containerID="cri-o://d5c9ae0cd8f8e6fc5aaeaf53ccf756ac9b3237fdaebfbcb3abfdb59e2bf1dcfa" gracePeriod=15 Apr 23 08:15:40.373679 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.373656 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cf99d454d-ljf9n_c86e2627-5829-4910-9869-1f89c4aadaa0/console/0.log" Apr 23 08:15:40.373793 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.373743 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cf99d454d-ljf9n" Apr 23 08:15:40.487795 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.487729 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c86e2627-5829-4910-9869-1f89c4aadaa0-console-serving-cert\") pod \"c86e2627-5829-4910-9869-1f89c4aadaa0\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " Apr 23 08:15:40.487795 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.487774 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c86e2627-5829-4910-9869-1f89c4aadaa0-console-oauth-config\") pod \"c86e2627-5829-4910-9869-1f89c4aadaa0\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " Apr 23 08:15:40.487964 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.487821 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-service-ca\") pod \"c86e2627-5829-4910-9869-1f89c4aadaa0\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " Apr 23 08:15:40.487964 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.487856 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-oauth-serving-cert\") pod \"c86e2627-5829-4910-9869-1f89c4aadaa0\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " Apr 23 08:15:40.487964 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.487890 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-console-config\") pod \"c86e2627-5829-4910-9869-1f89c4aadaa0\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " Apr 23 08:15:40.488115 ip-10-0-133-47 
kubenswrapper[2559]: I0423 08:15:40.488025 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpz44\" (UniqueName: \"kubernetes.io/projected/c86e2627-5829-4910-9869-1f89c4aadaa0-kube-api-access-cpz44\") pod \"c86e2627-5829-4910-9869-1f89c4aadaa0\" (UID: \"c86e2627-5829-4910-9869-1f89c4aadaa0\") " Apr 23 08:15:40.488286 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.488264 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-console-config" (OuterVolumeSpecName: "console-config") pod "c86e2627-5829-4910-9869-1f89c4aadaa0" (UID: "c86e2627-5829-4910-9869-1f89c4aadaa0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:15:40.488346 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.488278 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-service-ca" (OuterVolumeSpecName: "service-ca") pod "c86e2627-5829-4910-9869-1f89c4aadaa0" (UID: "c86e2627-5829-4910-9869-1f89c4aadaa0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:15:40.488346 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.488274 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c86e2627-5829-4910-9869-1f89c4aadaa0" (UID: "c86e2627-5829-4910-9869-1f89c4aadaa0"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:15:40.490154 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.490130 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c86e2627-5829-4910-9869-1f89c4aadaa0-kube-api-access-cpz44" (OuterVolumeSpecName: "kube-api-access-cpz44") pod "c86e2627-5829-4910-9869-1f89c4aadaa0" (UID: "c86e2627-5829-4910-9869-1f89c4aadaa0"). InnerVolumeSpecName "kube-api-access-cpz44". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:15:40.490284 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.490258 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86e2627-5829-4910-9869-1f89c4aadaa0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c86e2627-5829-4910-9869-1f89c4aadaa0" (UID: "c86e2627-5829-4910-9869-1f89c4aadaa0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:15:40.490327 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.490271 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86e2627-5829-4910-9869-1f89c4aadaa0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c86e2627-5829-4910-9869-1f89c4aadaa0" (UID: "c86e2627-5829-4910-9869-1f89c4aadaa0"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:15:40.588827 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.588802 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cpz44\" (UniqueName: \"kubernetes.io/projected/c86e2627-5829-4910-9869-1f89c4aadaa0-kube-api-access-cpz44\") on node \"ip-10-0-133-47.ec2.internal\" DevicePath \"\"" Apr 23 08:15:40.588827 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.588825 2559 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c86e2627-5829-4910-9869-1f89c4aadaa0-console-serving-cert\") on node \"ip-10-0-133-47.ec2.internal\" DevicePath \"\"" Apr 23 08:15:40.588960 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.588839 2559 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c86e2627-5829-4910-9869-1f89c4aadaa0-console-oauth-config\") on node \"ip-10-0-133-47.ec2.internal\" DevicePath \"\"" Apr 23 08:15:40.588960 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.588852 2559 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-service-ca\") on node \"ip-10-0-133-47.ec2.internal\" DevicePath \"\"" Apr 23 08:15:40.588960 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.588865 2559 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-oauth-serving-cert\") on node \"ip-10-0-133-47.ec2.internal\" DevicePath \"\"" Apr 23 08:15:40.588960 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:40.588879 2559 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c86e2627-5829-4910-9869-1f89c4aadaa0-console-config\") on node \"ip-10-0-133-47.ec2.internal\" DevicePath \"\"" Apr 23 08:15:41.157810 ip-10-0-133-47 
kubenswrapper[2559]: I0423 08:15:41.157786 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cf99d454d-ljf9n_c86e2627-5829-4910-9869-1f89c4aadaa0/console/0.log" Apr 23 08:15:41.158252 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:41.157823 2559 generic.go:358] "Generic (PLEG): container finished" podID="c86e2627-5829-4910-9869-1f89c4aadaa0" containerID="d5c9ae0cd8f8e6fc5aaeaf53ccf756ac9b3237fdaebfbcb3abfdb59e2bf1dcfa" exitCode=2 Apr 23 08:15:41.158252 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:41.157912 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf99d454d-ljf9n" event={"ID":"c86e2627-5829-4910-9869-1f89c4aadaa0","Type":"ContainerDied","Data":"d5c9ae0cd8f8e6fc5aaeaf53ccf756ac9b3237fdaebfbcb3abfdb59e2bf1dcfa"} Apr 23 08:15:41.158252 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:41.157912 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cf99d454d-ljf9n" Apr 23 08:15:41.158252 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:41.157953 2559 scope.go:117] "RemoveContainer" containerID="d5c9ae0cd8f8e6fc5aaeaf53ccf756ac9b3237fdaebfbcb3abfdb59e2bf1dcfa" Apr 23 08:15:41.158252 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:41.157940 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf99d454d-ljf9n" event={"ID":"c86e2627-5829-4910-9869-1f89c4aadaa0","Type":"ContainerDied","Data":"75b2f07b14b1cda4364f79947b61e5319d8fbedca5d653be4a7ae28549b6005b"} Apr 23 08:15:41.165848 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:41.165820 2559 scope.go:117] "RemoveContainer" containerID="d5c9ae0cd8f8e6fc5aaeaf53ccf756ac9b3237fdaebfbcb3abfdb59e2bf1dcfa" Apr 23 08:15:41.166096 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:15:41.166072 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d5c9ae0cd8f8e6fc5aaeaf53ccf756ac9b3237fdaebfbcb3abfdb59e2bf1dcfa\": container with ID starting with d5c9ae0cd8f8e6fc5aaeaf53ccf756ac9b3237fdaebfbcb3abfdb59e2bf1dcfa not found: ID does not exist" containerID="d5c9ae0cd8f8e6fc5aaeaf53ccf756ac9b3237fdaebfbcb3abfdb59e2bf1dcfa" Apr 23 08:15:41.166152 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:41.166109 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c9ae0cd8f8e6fc5aaeaf53ccf756ac9b3237fdaebfbcb3abfdb59e2bf1dcfa"} err="failed to get container status \"d5c9ae0cd8f8e6fc5aaeaf53ccf756ac9b3237fdaebfbcb3abfdb59e2bf1dcfa\": rpc error: code = NotFound desc = could not find container \"d5c9ae0cd8f8e6fc5aaeaf53ccf756ac9b3237fdaebfbcb3abfdb59e2bf1dcfa\": container with ID starting with d5c9ae0cd8f8e6fc5aaeaf53ccf756ac9b3237fdaebfbcb3abfdb59e2bf1dcfa not found: ID does not exist" Apr 23 08:15:41.177774 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:41.177750 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cf99d454d-ljf9n"] Apr 23 08:15:41.181677 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:41.181657 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7cf99d454d-ljf9n"] Apr 23 08:15:41.525638 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:15:41.525568 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c86e2627-5829-4910-9869-1f89c4aadaa0" path="/var/lib/kubelet/pods/c86e2627-5829-4910-9869-1f89c4aadaa0/volumes" Apr 23 08:16:11.849316 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.849284 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-577cccbdf7-56trt"] Apr 23 08:16:11.849884 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.849597 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c86e2627-5829-4910-9869-1f89c4aadaa0" containerName="console" Apr 23 08:16:11.849884 ip-10-0-133-47 kubenswrapper[2559]: 
I0423 08:16:11.849619 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86e2627-5829-4910-9869-1f89c4aadaa0" containerName="console" Apr 23 08:16:11.849884 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.849681 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="c86e2627-5829-4910-9869-1f89c4aadaa0" containerName="console" Apr 23 08:16:11.851508 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.851487 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-577cccbdf7-56trt" Apr 23 08:16:11.853821 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.853802 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 08:16:11.854667 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.854649 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 08:16:11.854751 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.854734 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 08:16:11.854894 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.854874 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 08:16:11.854977 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.854913 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-s87b2\"" Apr 23 08:16:11.854977 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.854957 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 08:16:11.855076 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.855011 2559 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 08:16:11.855076 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.855023 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 08:16:11.858985 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.858961 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 08:16:11.861238 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.861215 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-577cccbdf7-56trt"] Apr 23 08:16:11.884053 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.884033 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-config\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt" Apr 23 08:16:11.884148 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.884068 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-oauth-serving-cert\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt" Apr 23 08:16:11.884148 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.884085 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-288sj\" (UniqueName: \"kubernetes.io/projected/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-kube-api-access-288sj\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt" Apr 23 08:16:11.884148 ip-10-0-133-47 
kubenswrapper[2559]: I0423 08:16:11.884130 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-trusted-ca-bundle\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.884251 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.884157 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-service-ca\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.884251 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.884181 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-oauth-config\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.884251 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.884227 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-serving-cert\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.984937 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.984912 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-oauth-config\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.985021 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.984940 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-serving-cert\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.985021 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.984966 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-config\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.985105 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.985086 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-oauth-serving-cert\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.985142 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.985121 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-288sj\" (UniqueName: \"kubernetes.io/projected/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-kube-api-access-288sj\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.985274 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.985256 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-trusted-ca-bundle\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.985348 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.985297 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-service-ca\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.985584 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.985560 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-config\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.985795 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.985772 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-oauth-serving-cert\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.985925 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.985907 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-service-ca\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.985978 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.985953 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-trusted-ca-bundle\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.987445 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.987416 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-serving-cert\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.987523 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.987451 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-oauth-config\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:11.992905 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:11.992882 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-288sj\" (UniqueName: \"kubernetes.io/projected/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-kube-api-access-288sj\") pod \"console-577cccbdf7-56trt\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") " pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:12.161275 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:12.161207 2559 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:12.297079 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:12.297051 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-577cccbdf7-56trt"]
Apr 23 08:16:12.301639 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:16:12.301614 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc45dfd3f_7969_4e71_a85f_4f6e6d22067d.slice/crio-0063ae9c07ae284affe23e44f7ab11de299200c6fc7882723529922aacfbc5f1 WatchSource:0}: Error finding container 0063ae9c07ae284affe23e44f7ab11de299200c6fc7882723529922aacfbc5f1: Status 404 returned error can't find the container with id 0063ae9c07ae284affe23e44f7ab11de299200c6fc7882723529922aacfbc5f1
Apr 23 08:16:13.241135 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:13.241097 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-577cccbdf7-56trt" event={"ID":"c45dfd3f-7969-4e71-a85f-4f6e6d22067d","Type":"ContainerStarted","Data":"1e51afe9bc8cc4433c1a59383fe1ac9fe7e5c1d81cf2dde3118e0a25f759b91c"}
Apr 23 08:16:13.241135 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:13.241133 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-577cccbdf7-56trt" event={"ID":"c45dfd3f-7969-4e71-a85f-4f6e6d22067d","Type":"ContainerStarted","Data":"0063ae9c07ae284affe23e44f7ab11de299200c6fc7882723529922aacfbc5f1"}
Apr 23 08:16:13.258471 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:13.258420 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-577cccbdf7-56trt" podStartSLOduration=2.25840745 podStartE2EDuration="2.25840745s" podCreationTimestamp="2026-04-23 08:16:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:16:13.25811255 +0000 UTC m=+204.312707827" watchObservedRunningTime="2026-04-23 08:16:13.25840745 +0000 UTC m=+204.313002717"
Apr 23 08:16:22.161511 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:22.161467 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:22.161511 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:22.161520 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:22.165897 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:22.165877 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:16:22.268796 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:16:22.268773 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:17:49.395754 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:17:49.395721 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/ovn-acl-logging/0.log"
Apr 23 08:17:49.396295 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:17:49.396138 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/ovn-acl-logging/0.log"
Apr 23 08:17:49.400868 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:17:49.400848 2559 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 08:17:50.588620 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:17:50.588588 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-577cccbdf7-56trt"]
Apr 23 08:18:02.682309 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:02.682273 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-s7j99"]
Apr 23 08:18:02.683999 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:02.683982 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s7j99"
Apr 23 08:18:02.686280 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:02.686261 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 08:18:02.693104 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:02.693082 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s7j99"]
Apr 23 08:18:02.715227 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:02.715199 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9d297d32-84b7-40c3-9e54-25294e32e307-original-pull-secret\") pod \"global-pull-secret-syncer-s7j99\" (UID: \"9d297d32-84b7-40c3-9e54-25294e32e307\") " pod="kube-system/global-pull-secret-syncer-s7j99"
Apr 23 08:18:02.715227 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:02.715230 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9d297d32-84b7-40c3-9e54-25294e32e307-kubelet-config\") pod \"global-pull-secret-syncer-s7j99\" (UID: \"9d297d32-84b7-40c3-9e54-25294e32e307\") " pod="kube-system/global-pull-secret-syncer-s7j99"
Apr 23 08:18:02.715374 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:02.715253 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9d297d32-84b7-40c3-9e54-25294e32e307-dbus\") pod \"global-pull-secret-syncer-s7j99\" (UID: \"9d297d32-84b7-40c3-9e54-25294e32e307\") " pod="kube-system/global-pull-secret-syncer-s7j99"
Apr 23 08:18:02.816222 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:02.816189 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9d297d32-84b7-40c3-9e54-25294e32e307-original-pull-secret\") pod \"global-pull-secret-syncer-s7j99\" (UID: \"9d297d32-84b7-40c3-9e54-25294e32e307\") " pod="kube-system/global-pull-secret-syncer-s7j99"
Apr 23 08:18:02.816222 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:02.816221 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9d297d32-84b7-40c3-9e54-25294e32e307-kubelet-config\") pod \"global-pull-secret-syncer-s7j99\" (UID: \"9d297d32-84b7-40c3-9e54-25294e32e307\") " pod="kube-system/global-pull-secret-syncer-s7j99"
Apr 23 08:18:02.816415 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:02.816248 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9d297d32-84b7-40c3-9e54-25294e32e307-dbus\") pod \"global-pull-secret-syncer-s7j99\" (UID: \"9d297d32-84b7-40c3-9e54-25294e32e307\") " pod="kube-system/global-pull-secret-syncer-s7j99"
Apr 23 08:18:02.816415 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:02.816321 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9d297d32-84b7-40c3-9e54-25294e32e307-kubelet-config\") pod \"global-pull-secret-syncer-s7j99\" (UID: \"9d297d32-84b7-40c3-9e54-25294e32e307\") " pod="kube-system/global-pull-secret-syncer-s7j99"
Apr 23 08:18:02.816415 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:02.816388 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9d297d32-84b7-40c3-9e54-25294e32e307-dbus\") pod \"global-pull-secret-syncer-s7j99\" (UID: \"9d297d32-84b7-40c3-9e54-25294e32e307\") " pod="kube-system/global-pull-secret-syncer-s7j99"
Apr 23 08:18:02.818472 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:02.818453 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9d297d32-84b7-40c3-9e54-25294e32e307-original-pull-secret\") pod \"global-pull-secret-syncer-s7j99\" (UID: \"9d297d32-84b7-40c3-9e54-25294e32e307\") " pod="kube-system/global-pull-secret-syncer-s7j99"
Apr 23 08:18:02.992544 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:02.992450 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s7j99"
Apr 23 08:18:03.101213 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:03.101184 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s7j99"]
Apr 23 08:18:03.103973 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:18:03.103947 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d297d32_84b7_40c3_9e54_25294e32e307.slice/crio-eca15c2b1f078c2bb8507deb3d8c6b406f2222abf7d9fcb37833f716882f58c6 WatchSource:0}: Error finding container eca15c2b1f078c2bb8507deb3d8c6b406f2222abf7d9fcb37833f716882f58c6: Status 404 returned error can't find the container with id eca15c2b1f078c2bb8507deb3d8c6b406f2222abf7d9fcb37833f716882f58c6
Apr 23 08:18:03.105409 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:03.105394 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 08:18:03.525423 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:03.525390 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s7j99" event={"ID":"9d297d32-84b7-40c3-9e54-25294e32e307","Type":"ContainerStarted","Data":"eca15c2b1f078c2bb8507deb3d8c6b406f2222abf7d9fcb37833f716882f58c6"}
Apr 23 08:18:07.535693 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:07.535659 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s7j99" event={"ID":"9d297d32-84b7-40c3-9e54-25294e32e307","Type":"ContainerStarted","Data":"8f8664bb81d75e914e2b774efc0b62c73af25d1fb7e78b2d0db3f48628726a32"}
Apr 23 08:18:07.552138 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:07.552092 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-s7j99" podStartSLOduration=1.3250748620000001 podStartE2EDuration="5.552078129s" podCreationTimestamp="2026-04-23 08:18:02 +0000 UTC" firstStartedPulling="2026-04-23 08:18:03.105514098 +0000 UTC m=+314.160109343" lastFinishedPulling="2026-04-23 08:18:07.332517362 +0000 UTC m=+318.387112610" observedRunningTime="2026-04-23 08:18:07.550809223 +0000 UTC m=+318.605404493" watchObservedRunningTime="2026-04-23 08:18:07.552078129 +0000 UTC m=+318.606673374"
Apr 23 08:18:15.610040 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.609998 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-577cccbdf7-56trt" podUID="c45dfd3f-7969-4e71-a85f-4f6e6d22067d" containerName="console" containerID="cri-o://1e51afe9bc8cc4433c1a59383fe1ac9fe7e5c1d81cf2dde3118e0a25f759b91c" gracePeriod=15
Apr 23 08:18:15.833757 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.833736 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-577cccbdf7-56trt_c45dfd3f-7969-4e71-a85f-4f6e6d22067d/console/0.log"
Apr 23 08:18:15.833882 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.833808 2559 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:18:15.907644 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.907585 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-config\") pod \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") "
Apr 23 08:18:15.907644 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.907620 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-oauth-serving-cert\") pod \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") "
Apr 23 08:18:15.907644 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.907640 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-service-ca\") pod \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") "
Apr 23 08:18:15.907894 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.907660 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-trusted-ca-bundle\") pod \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") "
Apr 23 08:18:15.907894 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.907681 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-serving-cert\") pod \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") "
Apr 23 08:18:15.907894 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.907783 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-288sj\" (UniqueName: \"kubernetes.io/projected/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-kube-api-access-288sj\") pod \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") "
Apr 23 08:18:15.907894 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.907817 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-oauth-config\") pod \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\" (UID: \"c45dfd3f-7969-4e71-a85f-4f6e6d22067d\") "
Apr 23 08:18:15.908131 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.908107 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-config" (OuterVolumeSpecName: "console-config") pod "c45dfd3f-7969-4e71-a85f-4f6e6d22067d" (UID: "c45dfd3f-7969-4e71-a85f-4f6e6d22067d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:15.908182 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.908157 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-service-ca" (OuterVolumeSpecName: "service-ca") pod "c45dfd3f-7969-4e71-a85f-4f6e6d22067d" (UID: "c45dfd3f-7969-4e71-a85f-4f6e6d22067d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:15.908229 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.908163 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c45dfd3f-7969-4e71-a85f-4f6e6d22067d" (UID: "c45dfd3f-7969-4e71-a85f-4f6e6d22067d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:15.908229 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.908177 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c45dfd3f-7969-4e71-a85f-4f6e6d22067d" (UID: "c45dfd3f-7969-4e71-a85f-4f6e6d22067d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:15.909763 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.909739 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c45dfd3f-7969-4e71-a85f-4f6e6d22067d" (UID: "c45dfd3f-7969-4e71-a85f-4f6e6d22067d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:15.909855 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.909833 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-kube-api-access-288sj" (OuterVolumeSpecName: "kube-api-access-288sj") pod "c45dfd3f-7969-4e71-a85f-4f6e6d22067d" (UID: "c45dfd3f-7969-4e71-a85f-4f6e6d22067d"). InnerVolumeSpecName "kube-api-access-288sj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:18:15.909984 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:15.909969 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c45dfd3f-7969-4e71-a85f-4f6e6d22067d" (UID: "c45dfd3f-7969-4e71-a85f-4f6e6d22067d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:16.008404 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.008384 2559 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-config\") on node \"ip-10-0-133-47.ec2.internal\" DevicePath \"\""
Apr 23 08:18:16.008507 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.008407 2559 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-oauth-serving-cert\") on node \"ip-10-0-133-47.ec2.internal\" DevicePath \"\""
Apr 23 08:18:16.008507 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.008421 2559 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-service-ca\") on node \"ip-10-0-133-47.ec2.internal\" DevicePath \"\""
Apr 23 08:18:16.008507 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.008433 2559 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-trusted-ca-bundle\") on node \"ip-10-0-133-47.ec2.internal\" DevicePath \"\""
Apr 23 08:18:16.008507 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.008448 2559 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-serving-cert\") on node \"ip-10-0-133-47.ec2.internal\" DevicePath \"\""
Apr 23 08:18:16.008507 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.008461 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-288sj\" (UniqueName: \"kubernetes.io/projected/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-kube-api-access-288sj\") on node \"ip-10-0-133-47.ec2.internal\" DevicePath \"\""
Apr 23 08:18:16.008507 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.008474 2559 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c45dfd3f-7969-4e71-a85f-4f6e6d22067d-console-oauth-config\") on node \"ip-10-0-133-47.ec2.internal\" DevicePath \"\""
Apr 23 08:18:16.560446 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.560419 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-577cccbdf7-56trt_c45dfd3f-7969-4e71-a85f-4f6e6d22067d/console/0.log"
Apr 23 08:18:16.560610 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.560460 2559 generic.go:358] "Generic (PLEG): container finished" podID="c45dfd3f-7969-4e71-a85f-4f6e6d22067d" containerID="1e51afe9bc8cc4433c1a59383fe1ac9fe7e5c1d81cf2dde3118e0a25f759b91c" exitCode=2
Apr 23 08:18:16.560610 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.560514 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-577cccbdf7-56trt" event={"ID":"c45dfd3f-7969-4e71-a85f-4f6e6d22067d","Type":"ContainerDied","Data":"1e51afe9bc8cc4433c1a59383fe1ac9fe7e5c1d81cf2dde3118e0a25f759b91c"}
Apr 23 08:18:16.560610 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.560536 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-577cccbdf7-56trt" event={"ID":"c45dfd3f-7969-4e71-a85f-4f6e6d22067d","Type":"ContainerDied","Data":"0063ae9c07ae284affe23e44f7ab11de299200c6fc7882723529922aacfbc5f1"}
Apr 23 08:18:16.560610 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.560541 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-577cccbdf7-56trt"
Apr 23 08:18:16.560883 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.560552 2559 scope.go:117] "RemoveContainer" containerID="1e51afe9bc8cc4433c1a59383fe1ac9fe7e5c1d81cf2dde3118e0a25f759b91c"
Apr 23 08:18:16.570918 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.570896 2559 scope.go:117] "RemoveContainer" containerID="1e51afe9bc8cc4433c1a59383fe1ac9fe7e5c1d81cf2dde3118e0a25f759b91c"
Apr 23 08:18:16.571206 ip-10-0-133-47 kubenswrapper[2559]: E0423 08:18:16.571188 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e51afe9bc8cc4433c1a59383fe1ac9fe7e5c1d81cf2dde3118e0a25f759b91c\": container with ID starting with 1e51afe9bc8cc4433c1a59383fe1ac9fe7e5c1d81cf2dde3118e0a25f759b91c not found: ID does not exist" containerID="1e51afe9bc8cc4433c1a59383fe1ac9fe7e5c1d81cf2dde3118e0a25f759b91c"
Apr 23 08:18:16.571263 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.571212 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e51afe9bc8cc4433c1a59383fe1ac9fe7e5c1d81cf2dde3118e0a25f759b91c"} err="failed to get container status \"1e51afe9bc8cc4433c1a59383fe1ac9fe7e5c1d81cf2dde3118e0a25f759b91c\": rpc error: code = NotFound desc = could not find container \"1e51afe9bc8cc4433c1a59383fe1ac9fe7e5c1d81cf2dde3118e0a25f759b91c\": container with ID starting with 1e51afe9bc8cc4433c1a59383fe1ac9fe7e5c1d81cf2dde3118e0a25f759b91c not found: ID does not exist"
Apr 23 08:18:16.582430 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.582410 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-577cccbdf7-56trt"]
Apr 23 08:18:16.586998 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:16.586965 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api"
pods=["openshift-console/console-577cccbdf7-56trt"]
Apr 23 08:18:17.525829 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:18:17.525799 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c45dfd3f-7969-4e71-a85f-4f6e6d22067d" path="/var/lib/kubelet/pods/c45dfd3f-7969-4e71-a85f-4f6e6d22067d/volumes"
Apr 23 08:19:20.413389 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.413360 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x67hq/must-gather-9ltr7"]
Apr 23 08:19:20.413873 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.413572 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c45dfd3f-7969-4e71-a85f-4f6e6d22067d" containerName="console"
Apr 23 08:19:20.413873 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.413582 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="c45dfd3f-7969-4e71-a85f-4f6e6d22067d" containerName="console"
Apr 23 08:19:20.413873 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.413628 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="c45dfd3f-7969-4e71-a85f-4f6e6d22067d" containerName="console"
Apr 23 08:19:20.416193 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.416176 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x67hq/must-gather-9ltr7"
Apr 23 08:19:20.418608 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.418589 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-x67hq\"/\"default-dockercfg-kgww6\""
Apr 23 08:19:20.418744 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.418591 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x67hq\"/\"openshift-service-ca.crt\""
Apr 23 08:19:20.419576 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.419558 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x67hq\"/\"kube-root-ca.crt\""
Apr 23 08:19:20.422604 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.422584 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x67hq/must-gather-9ltr7"]
Apr 23 08:19:20.520236 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.520212 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/addd689a-7c4a-45e4-a702-107770d9adc1-must-gather-output\") pod \"must-gather-9ltr7\" (UID: \"addd689a-7c4a-45e4-a702-107770d9adc1\") " pod="openshift-must-gather-x67hq/must-gather-9ltr7"
Apr 23 08:19:20.520337 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.520252 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxvqk\" (UniqueName: \"kubernetes.io/projected/addd689a-7c4a-45e4-a702-107770d9adc1-kube-api-access-rxvqk\") pod \"must-gather-9ltr7\" (UID: \"addd689a-7c4a-45e4-a702-107770d9adc1\") " pod="openshift-must-gather-x67hq/must-gather-9ltr7"
Apr 23 08:19:20.620919 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.620900 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/addd689a-7c4a-45e4-a702-107770d9adc1-must-gather-output\") pod \"must-gather-9ltr7\" (UID: \"addd689a-7c4a-45e4-a702-107770d9adc1\") " pod="openshift-must-gather-x67hq/must-gather-9ltr7"
Apr 23 08:19:20.621014 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.620925 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxvqk\" (UniqueName: \"kubernetes.io/projected/addd689a-7c4a-45e4-a702-107770d9adc1-kube-api-access-rxvqk\") pod \"must-gather-9ltr7\" (UID: \"addd689a-7c4a-45e4-a702-107770d9adc1\") " pod="openshift-must-gather-x67hq/must-gather-9ltr7"
Apr 23 08:19:20.621234 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.621216 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/addd689a-7c4a-45e4-a702-107770d9adc1-must-gather-output\") pod \"must-gather-9ltr7\" (UID: \"addd689a-7c4a-45e4-a702-107770d9adc1\") " pod="openshift-must-gather-x67hq/must-gather-9ltr7"
Apr 23 08:19:20.628861 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.628840 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxvqk\" (UniqueName: \"kubernetes.io/projected/addd689a-7c4a-45e4-a702-107770d9adc1-kube-api-access-rxvqk\") pod \"must-gather-9ltr7\" (UID: \"addd689a-7c4a-45e4-a702-107770d9adc1\") " pod="openshift-must-gather-x67hq/must-gather-9ltr7"
Apr 23 08:19:20.725860 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.725799 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x67hq/must-gather-9ltr7"
Apr 23 08:19:20.837300 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:20.837208 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x67hq/must-gather-9ltr7"]
Apr 23 08:19:20.839902 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:19:20.839875 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaddd689a_7c4a_45e4_a702_107770d9adc1.slice/crio-2cc2a74a5edb8b842ecd0400d71372c523b3e2a5cb2adb1feb3ca8e86a6339f9 WatchSource:0}: Error finding container 2cc2a74a5edb8b842ecd0400d71372c523b3e2a5cb2adb1feb3ca8e86a6339f9: Status 404 returned error can't find the container with id 2cc2a74a5edb8b842ecd0400d71372c523b3e2a5cb2adb1feb3ca8e86a6339f9
Apr 23 08:19:21.729984 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:21.729929 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x67hq/must-gather-9ltr7" event={"ID":"addd689a-7c4a-45e4-a702-107770d9adc1","Type":"ContainerStarted","Data":"2cc2a74a5edb8b842ecd0400d71372c523b3e2a5cb2adb1feb3ca8e86a6339f9"}
Apr 23 08:19:22.733687 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:22.733657 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x67hq/must-gather-9ltr7" event={"ID":"addd689a-7c4a-45e4-a702-107770d9adc1","Type":"ContainerStarted","Data":"8366bf8fc980e815bd372575f2365d90af3dfec4127b3dcf0f02391fd6a1e9b1"}
Apr 23 08:19:22.733687 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:22.733691 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x67hq/must-gather-9ltr7" event={"ID":"addd689a-7c4a-45e4-a702-107770d9adc1","Type":"ContainerStarted","Data":"cdc1f8fd6f65888c8e962d46e41a748504f05193544b8f1a47a2d9511a32fe87"}
Apr 23 08:19:22.749826 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:22.749579 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x67hq/must-gather-9ltr7" podStartSLOduration=1.824019958 podStartE2EDuration="2.749563462s" podCreationTimestamp="2026-04-23 08:19:20 +0000 UTC" firstStartedPulling="2026-04-23 08:19:20.841667894 +0000 UTC m=+391.896263142" lastFinishedPulling="2026-04-23 08:19:21.767211387 +0000 UTC m=+392.821806646" observedRunningTime="2026-04-23 08:19:22.748848085 +0000 UTC m=+393.803443354" watchObservedRunningTime="2026-04-23 08:19:22.749563462 +0000 UTC m=+393.804158731"
Apr 23 08:19:23.059046 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:23.059017 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-s7j99_9d297d32-84b7-40c3-9e54-25294e32e307/global-pull-secret-syncer/0.log"
Apr 23 08:19:23.193995 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:23.193967 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6b6vz_e3a4bfb9-cf61-4040-ba6a-b48997b19fce/konnectivity-agent/0.log"
Apr 23 08:19:23.267077 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:23.267043 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-47.ec2.internal_ffda46011da9673651bbfae81aa86260/haproxy/0.log"
Apr 23 08:19:26.779092 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:26.779058 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wkz4q_8acc3a89-0290-48b6-8242-999d9b623b35/node-exporter/0.log"
Apr 23 08:19:26.798969 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:26.798889 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wkz4q_8acc3a89-0290-48b6-8242-999d9b623b35/kube-rbac-proxy/0.log"
Apr 23 08:19:26.819130 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:26.819108 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wkz4q_8acc3a89-0290-48b6-8242-999d9b623b35/init-textfile/0.log"
Apr 23 08:19:27.086298 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:27.086264 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-qhmgn_f1caca1b-e23c-4bb4-8fd1-3980b966f382/prometheus-operator/0.log"
Apr 23 08:19:27.102920 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:27.102894 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-qhmgn_f1caca1b-e23c-4bb4-8fd1-3980b966f382/kube-rbac-proxy/0.log"
Apr 23 08:19:29.455483 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.455448 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b"]
Apr 23 08:19:29.460314 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.460289 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b"
Apr 23 08:19:29.472422 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.472382 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b"]
Apr 23 08:19:29.587870 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.587839 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-proc\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b"
Apr 23 08:19:29.587870 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.587872 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdxtr\" (UniqueName: \"kubernetes.io/projected/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-kube-api-access-cdxtr\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") "
pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.588055 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.587893 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-lib-modules\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.588055 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.588007 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-podres\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.588055 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.588037 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-sys\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.688516 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.688485 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-podres\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.688947 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.688927 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-sys\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.689060 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.688869 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-podres\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.689060 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.689055 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-proc\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.689208 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.689084 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdxtr\" (UniqueName: \"kubernetes.io/projected/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-kube-api-access-cdxtr\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.689208 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.689109 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-lib-modules\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.689208 ip-10-0-133-47 kubenswrapper[2559]: I0423 
08:19:29.689119 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-sys\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.689208 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.689119 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-proc\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.689338 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.689224 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-lib-modules\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.696777 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.696755 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdxtr\" (UniqueName: \"kubernetes.io/projected/f2c907ac-f3d9-45bf-aa74-5991f8b58be6-kube-api-access-cdxtr\") pod \"perf-node-gather-daemonset-rnk7b\" (UID: \"f2c907ac-f3d9-45bf-aa74-5991f8b58be6\") " pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.772261 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.772192 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:29.900356 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:29.900323 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b"] Apr 23 08:19:29.903397 ip-10-0-133-47 kubenswrapper[2559]: W0423 08:19:29.903370 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf2c907ac_f3d9_45bf_aa74_5991f8b58be6.slice/crio-427e3d9b69f1bbf7fd18ac1695c66a0e7d26b8f4f3bb3fe8ecb194e4fb2f0a42 WatchSource:0}: Error finding container 427e3d9b69f1bbf7fd18ac1695c66a0e7d26b8f4f3bb3fe8ecb194e4fb2f0a42: Status 404 returned error can't find the container with id 427e3d9b69f1bbf7fd18ac1695c66a0e7d26b8f4f3bb3fe8ecb194e4fb2f0a42 Apr 23 08:19:30.102358 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:30.102324 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qlns9_61623770-f083-4741-a16c-427c7d637226/dns/0.log" Apr 23 08:19:30.123910 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:30.123891 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qlns9_61623770-f083-4741-a16c-427c7d637226/kube-rbac-proxy/0.log" Apr 23 08:19:30.185243 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:30.185226 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-58xlf_7a708d86-77eb-45d6-9cb2-d56380306e76/dns-node-resolver/0.log" Apr 23 08:19:30.689020 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:30.688940 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lwlgs_49916299-de53-445c-b3f3-1fc134a6e4cc/node-ca/0.log" Apr 23 08:19:30.763257 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:30.763229 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" 
event={"ID":"f2c907ac-f3d9-45bf-aa74-5991f8b58be6","Type":"ContainerStarted","Data":"e00d60626c89153176709e1475db04d9a964c691026f04f7243498341e527a9c"} Apr 23 08:19:30.763257 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:30.763262 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" event={"ID":"f2c907ac-f3d9-45bf-aa74-5991f8b58be6","Type":"ContainerStarted","Data":"427e3d9b69f1bbf7fd18ac1695c66a0e7d26b8f4f3bb3fe8ecb194e4fb2f0a42"} Apr 23 08:19:30.763483 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:30.763346 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:30.778845 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:30.778806 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" podStartSLOduration=1.7787942650000002 podStartE2EDuration="1.778794265s" podCreationTimestamp="2026-04-23 08:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:19:30.777867068 +0000 UTC m=+401.832462337" watchObservedRunningTime="2026-04-23 08:19:30.778794265 +0000 UTC m=+401.833389532" Apr 23 08:19:31.606167 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:31.606127 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4jflg_273cd29d-2e09-4e88-b51d-9b39805b5849/serve-healthcheck-canary/0.log" Apr 23 08:19:32.080231 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:32.080206 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xmsss_ab7bfdf3-4a34-45ea-8800-45dc49f9cca7/kube-rbac-proxy/0.log" Apr 23 08:19:32.099778 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:32.099754 2559 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-xmsss_ab7bfdf3-4a34-45ea-8800-45dc49f9cca7/exporter/0.log" Apr 23 08:19:32.120613 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:32.120590 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xmsss_ab7bfdf3-4a34-45ea-8800-45dc49f9cca7/extractor/0.log" Apr 23 08:19:36.777217 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:36.777181 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-x67hq/perf-node-gather-daemonset-rnk7b" Apr 23 08:19:36.892978 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:36.892954 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5lbkz_f22c4ac1-2d2c-4f45-9e34-42cd608dab18/kube-multus-additional-cni-plugins/0.log" Apr 23 08:19:36.912206 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:36.912177 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5lbkz_f22c4ac1-2d2c-4f45-9e34-42cd608dab18/egress-router-binary-copy/0.log" Apr 23 08:19:36.931340 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:36.931322 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5lbkz_f22c4ac1-2d2c-4f45-9e34-42cd608dab18/cni-plugins/0.log" Apr 23 08:19:36.952576 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:36.952559 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5lbkz_f22c4ac1-2d2c-4f45-9e34-42cd608dab18/bond-cni-plugin/0.log" Apr 23 08:19:36.970167 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:36.970150 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5lbkz_f22c4ac1-2d2c-4f45-9e34-42cd608dab18/routeoverride-cni/0.log" Apr 23 08:19:36.990428 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:36.990413 
2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5lbkz_f22c4ac1-2d2c-4f45-9e34-42cd608dab18/whereabouts-cni-bincopy/0.log" Apr 23 08:19:37.009315 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:37.009300 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5lbkz_f22c4ac1-2d2c-4f45-9e34-42cd608dab18/whereabouts-cni/0.log" Apr 23 08:19:37.435985 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:37.435956 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tn7qf_58cfa3d8-1954-40b4-ac7c-f082a1e07777/kube-multus/0.log" Apr 23 08:19:37.595390 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:37.595351 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nkshk_ad7c4f52-9a6e-427d-8a37-1c27216d412e/network-metrics-daemon/0.log" Apr 23 08:19:37.621199 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:37.621160 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nkshk_ad7c4f52-9a6e-427d-8a37-1c27216d412e/kube-rbac-proxy/0.log" Apr 23 08:19:38.410185 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:38.410156 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/ovn-controller/0.log" Apr 23 08:19:38.430252 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:38.430205 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/ovn-acl-logging/0.log" Apr 23 08:19:38.434585 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:38.434561 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/ovn-acl-logging/1.log" Apr 23 08:19:38.458990 ip-10-0-133-47 kubenswrapper[2559]: I0423 
08:19:38.458950 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/kube-rbac-proxy-node/0.log" Apr 23 08:19:38.481264 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:38.481234 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 08:19:38.500727 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:38.498348 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/northd/0.log" Apr 23 08:19:38.527407 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:38.527382 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/nbdb/0.log" Apr 23 08:19:38.547225 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:38.547187 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/sbdb/0.log" Apr 23 08:19:38.703779 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:38.703686 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqpq2_be5235d2-68c0-4721-b23c-a3e24721ae65/ovnkube-controller/0.log" Apr 23 08:19:40.257136 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:40.257112 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zxxfs_d399127c-4640-46ee-b923-43ca2adc7c1e/network-check-target-container/0.log" Apr 23 08:19:41.038127 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:41.038098 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-99xdt_4f98cb24-fbc1-4718-bb62-2d985cadf144/iptables-alerter/0.log" Apr 23 08:19:41.728864 ip-10-0-133-47 
kubenswrapper[2559]: I0423 08:19:41.728836 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-gxzvp_cfbc15a5-2478-44d8-902f-9d7269635055/tuned/0.log" Apr 23 08:19:44.754587 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:44.754544 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-r8lvw_159f214d-7437-426c-9ffa-1470336cf132/csi-driver/0.log" Apr 23 08:19:44.774031 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:44.773996 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-r8lvw_159f214d-7437-426c-9ffa-1470336cf132/csi-node-driver-registrar/0.log" Apr 23 08:19:44.794403 ip-10-0-133-47 kubenswrapper[2559]: I0423 08:19:44.794379 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-r8lvw_159f214d-7437-426c-9ffa-1470336cf132/csi-liveness-probe/0.log"